
News & Views: Connecting the Metadata Dots

Dan Pollock and Ann Michael • September 21, 2021

Connecting the Metadata Dots: An Hypothesis, A Method, and a Peer Review Walk into a Pub…

By Heather Staines


Background


A new platform called Octopus has been in the news lately, as it was referenced in a recent UK Research and Innovation (UKRI) policy statement. Octopus defines eight types of content objects that might be added throughout the stages of the research and publication process, including a hypothesis, a method, data, an analysis, or a peer review. The conversation has turned on whether such a new platform, with (so far) limited funding, can change the way scholarly communication happens.


Indeed, many elements of the Octopus platform have already been with us for some time, perhaps not all collected centrally on one service. The conversation got me thinking again about the variety of content types and versions that make up our scholarly communications ecosystem and what will be necessary to help researchers utilize them effectively.


Getting smaller: Micropublications


One aspect of Octopus is the notion that a researcher would post several micropublications throughout their research process. Micropublications are hardly new. On Micropublication.org you can see peer-reviewed micropublications that include brief results, novel findings, negative or reproduced results, and findings that may lack a broader narrative. Each has a DOI, is indexed, and is often deposited in referential databases such as Wormbase, Flybase, Xenbase, the Arabidopsis Information Resource, and more. Researchers can build upon their existing publications, and others can benefit from the data as well.

Registered Reports are another form of micropublication, one that puts the emphasis on the research question and the quality of the methodology. Peer review takes place even before data are collected. Authors who follow the methodology they have registered can be routed through a publication workflow when their research is complete. Registered Reports might not be so "micro," but they are indicative of a preliminary research stage. Versions of publications, such as those that result from updates posted to preprint servers, represent a later stage.


Search and you will find (hopefully!): Metadata connections


As an author, I have work that sits on publisher platforms, on the open web, and in institutional repositories, to name a few places. Related content, such as reviews, annotations, media mentions, articles citing my work, and resources I have cited, exists in still other places. A journal article might live on SpringerLink, in PubMed, Ovid, JSTOR, or EBSCOhost. An ebook might be published on some combination of Project Muse, Knowledge Unlatched, Open Library of the Humanities, or PubPub. Standards and best practices around metadata and identifiers help us parse these versions, but they may not help us understand how each connects to the related objects that we might also benefit from exploring.


To make this distributed ecosystem function in a way that is useful to researchers, connections between items need to be categorized and machine-readable. A few years ago, Crossref introduced DOIs specifically for units of publication smaller than articles or book chapters, such as peer reviews and annotations. The metadata schema provides for asserting the connection between the peer review and the item being reviewed. While the schema isn't perfect, data show nearly 200,000 deposits for "peer reviews." Crossref also released a tool called Event Data, where activities connected to content with DOIs can be deposited. This tool tracks annotations, for example, on content that has a DOI. It also notes annotations that contain references to other content items with DOIs, effectively linking the items together.
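As an illustration of how Event Data can be queried, here is a minimal Python sketch that builds a query URL for events attached to a given DOI. The endpoint and parameter names (`obj-id`, `source`, `mailto`) follow the Event Data documentation as best understood and should be verified before use; the DOI and email address are placeholders.

```python
from urllib.parse import urlencode

# Assumed Event Data endpoint; check the current Crossref docs before relying on it.
EVENT_DATA_API = "https://api.eventdata.crossref.org/v1/events"

def event_query_url(doi: str, source: str, mailto: str) -> str:
    """Build a query for events (e.g., annotations) whose object is the given DOI."""
    params = {
        "mailto": mailto,                    # polite-pool contact address
        "obj-id": f"https://doi.org/{doi}",  # the content item the events point at
        "source": source,                    # e.g. "hypothesis" for web annotations
    }
    return f"{EVENT_DATA_API}?{urlencode(params)}"

url = event_query_url("10.1000/xyz123", "hypothesis", "you@example.org")
print(url)
```

Each returned event links a subject (the annotation) to an object (the DOI-bearing content), which is exactly the kind of machine-readable connection discussed above.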


An interesting place to look at how these connections are evolving is the open source platform PubPub, maintained by my former employer, the Knowledge Futures Group, which introduced the concept of Linked Pubs (a chapter, article, or other unit of content) about a year ago. Creators can use the Crossref relationship schema to assert different relationships between Pubs on the platform and content elsewhere on the web. As of this writing, there are 1,884 Linked Pubs in 56 different communities (groupings of collaborative activity). Review and supplement are the most used types, at 28% and 25%, respectively. Version is next at about 19%, with a more or less even distribution among the other types (mostly commentary, reply, etc.) after that. Interestingly, there is a good balance between links to other Pubs and links to content that lives elsewhere on the web: about 40% point to other Pubs and 60% to external content. Two examples: RTI Press uses Version to create multimedia iterations of previously published content, and Fermentology uses Supplement to add materials for courses.
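For a rough sense of scale, the percentages above can be turned into approximate counts. This is back-of-the-envelope arithmetic on the figures as quoted; rounding means the numbers are approximate.

```python
TOTAL_LINKED_PUBS = 1884  # as reported at the time of writing

# Shares of the most-used relationship types, as quoted in the text.
shares = {"review": 0.28, "supplement": 0.25, "version": 0.19}
counts = {kind: round(TOTAL_LINKED_PUBS * share) for kind, share in shares.items()}
print(counts)  # approximate counts per type

# Internal links (to other Pubs) versus links to external web content.
internal = round(TOTAL_LINKED_PUBS * 0.40)
external = round(TOTAL_LINKED_PUBS * 0.60)
print(internal, external)
```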


Resources

Crossref Relationship Schema: Documents relationships between different research objects.

DocMaps: Provides machine-readable data and context about how community groups and peer review platforms are evaluating preprints, to facilitate the exchange, aggregation, and publishing of peer reviews within a distributed, interoperable infrastructure.

NISO Access and License Indicators (ALI): A project to add metadata and indicators that allow metadata users, such as content platforms, to filter or target subsets of license information.

NISO Open Discovery Initiative (ODI): A technical recommendation for data exchange, including data formats, method of delivery, usage reporting, frequency of updates, and rights of use.

NISO ResourceSync: Researches, develops, prototypes, tests, and deploys mechanisms for the large-scale synchronization of web resources.

Publishing Status Ontology: An ontology designed to characterize the publication status of documents at each stage of the publishing process (draft, submitted, under review, etc.).

Publishing Workflow Ontology: A simple ontology for describing the steps in the workflow associated with the publication of a document or other publication entity.
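Of these resources, DocMaps is the most workflow-oriented. As a loose illustration of the kind of machine-readable record it aims to standardize, here is a heavily simplified Python sketch of a docmap-like structure connecting a preprint to a review. Field names only loosely follow published DocMaps examples, and the publisher name, DOI, and date are invented; consult the DocMaps framework itself for the normative shape.

```python
import json

# Illustrative only: field names approximate published DocMaps examples
# and are NOT guaranteed to match the normative schema.
docmap_sketch = {
    "type": "docmap",
    "publisher": {"name": "Example Review Community"},  # hypothetical community
    "first-step": "_:b0",
    "steps": {
        "_:b0": {
            "inputs": [{"doi": "10.1101/2021.01.01.425001"}],  # hypothetical preprint DOI
            "actions": [{
                "participants": [{"role": "peer-reviewer"}],
                "outputs": [{"type": "review", "published": "2021-09-01"}],
            }],
            "assertions": [{"status": "reviewed"}],
        }
    },
}

serialized = json.dumps(docmap_sketch, indent=2)
print(serialized)
```

The point of the structure is that a machine can walk from the preprint (the step's input) to the review (the action's output) without human interpretation, which is what makes aggregation across platforms feasible.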


Peer review in the wild: Annotations, blog posts, curated feeds, and overlay journals


With new flavors of open peer review emerging, small publications of this type are growing in number. Peer review now takes place in a variety of settings, and the resulting reviews may not be hosted alongside the content itself. Overlay journals such as Rapid Reviews: COVID-19 and Current Cities publish reviews about content hosted elsewhere, on a preprint server, for example.

Less formal versions of an overlay model include blogs such as PreLights from the Company of Biologists, in which early-career researchers write reviews to highlight preprints that interest them. This gives them experience in both reviewing and blogging, and, more than half of the time, the preprint author ends up corresponding with them about their feedback. PreLights data show that 93% of these preprints are published in a journal within two years.


But how can we pull all of this activity together in a useful way? Sciety is a new project from eLife which connects communities evaluating preprints, curating the resulting reviews, and facilitating discovery through social media. This effort seeks to move the review and curation process to the post-publication space. Sciety works with existing reviewer communities and inspires the creation of new ones. Such reviewer feedback can be connected back to the publication space to close the loop. bioRxiv has introduced a dashboard feature that pulls in such reviews from the community and also from publisher initiatives to make them discoverable.


Where do we go from here?


A few years back, a publisher friend on his way to a Crossref DOI committee meeting asked: "What is a publisher to do to manage DOIs now that content is hosted in so many different places?" To me, the answer was clear. The beauty of it, my friend, is that the publisher no longer has to manage it. But don't get me wrong, there is still a role for publishers (and librarians and researchers) to play as trusted third parties in asserting relationships between digital objects: this article is the same as this document; this dataset is related to this experiment; this review is connected to this book chapter; this is a later version of that.
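Those everyday assertions map naturally onto controlled relationship types. The sketch below pairs each with a plausible term; the term names follow the Crossref relationships vocabulary as best understood and should be checked against the schema documentation before depositing.

```python
# Each everyday assertion from the paragraph above, paired with a plausible
# relationship-type term. Term names are assumptions drawn from the Crossref
# relationships vocabulary; verify against the schema docs before use.
ASSERTION_TO_RELATIONSHIP = {
    "this article is the same as this document": "isSameAs",
    "this dataset is related to this experiment": "isSupplementTo",
    "this review is connected to this book chapter": "isReviewOf",
    "this is a later version of that": "isVersionOf",
}

for claim, rel_type in ASSERTION_TO_RELATIONSHIP.items():
    print(f"{claim} -> {rel_type}")
```

Once an assertion is expressed with a controlled term like these, any downstream platform can interpret the connection without knowing who deposited it, which is what makes the "trusted third party" role workable at scale.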


Much work still needs to be done: around the creation of metadata useful for discovery and classification, and around search interfaces optimized to make clear to users which types of things their search results contain and how those things connect to one another.


In addition to Crossref initiatives, NISO has a number of projects that may be useful, including the Open Discovery Initiative (ODI), Access and License Indicators (ALI), and ResourceSync to help with versioning. A project called DocMaps, a collaboration between Cold Spring Harbor Laboratory (CSHL), eLife/Sciety, EMBO, and the Knowledge Futures Group, uses as a framework a “set of agreed-upon conventions for aligning editorial processes with the Publishing Status Ontology (PSO) and the Publishing Workflow Ontology (PWO) and for expressing these events in a domain-specific language that can be easily interpreted by machines and humans alike.”


We’re still at the beginning of this new journey, but I look forward to seeing where we go.


This article is © 2021 Delta Think, Inc. It is published under a Creative Commons Attribution-NonCommercial 4.0 International License. Please do get in touch if you want to use it in other contexts – we’re usually pretty accommodating.


TOP HEADLINES


EUt+ alliance and OpenAIRE join forces for open science – September 8, 2021

"The OpenAIRE-Nexus consortium and the European University of Technology (EUt+), have announced a new cooperation agreement to improve the integration and discoverability of research results, to showcase the synergies and connections and aggregate the work from all partners across this pan-European network of Universities."


New report confirms positive momentum for EU open science – September 6, 2021

"The Commission released the results and datasets of a study monitoring the open access mandate in Horizon 2020. With a steadily increase over the years and an average success rate of 83% open access to scientific publications, the European Commission is at the forefront of research and innovation funders concluded the consortium formed by the analysis company PPMI (Lithuania), research and innovation centre Athena (Greece) and Maastricht University (the Netherlands)."


cOAlition S statement on Open Access for academic books – September 2, 2021

"cOAlition S recognizes that academic book publishing is very different from journal publishing. Our commitment is to make progress towards full open access for academic books as soon as possible, in the understanding that standards and funding models may need more time to develop. Rather than to decree a uniform policy on OA books, we have therefore decided to formulate a set of recommendations regarding academic books – in line with Plan S principles – that all cOAlition S organisations will seek to adopt within their own remits and jurisdictions."


Karger Publishers Advances Open Access with Plan S Aligned Transformative Journals – August 25, 2021

"Karger Publishers is adopting the Plan S-aligned ‘Transformative Journal’ model for a growing number of journals, acting on its commitment to accelerate the transition to Open Access (OA). Seven Karger Publishers journals have committed to the Transformative Journals model so far."


SCOSS Campaign: DOAB/OAPEN reaches important funding milestone – August 17, 2021

"The Directory of Open Access Books (DOAB) and OAPEN, jointly part of SCOSS’s second funding cycle, has met a significant milestone by reaching its three-year funding goal of 505,000 Euros in about 18 months, despite the COVID-19 challenge."


OA JOURNAL LAUNCHES


September 1, 2021

AAS Journals Will Switch to Open Access 

"The American Astronomical Society (AAS), a leading nonprofit professional association for astronomers, announced the switch of its prestigious journals to fully open access (OA) as of 1 January 2022. Under this change, all articles in the AAS journal portfolio will be immediately open for anyone to freely read."


August 23, 2021

RSC New Journal Launch: Energy Advances

RSC's "new Gold Open Access journal Energy Advances focuses on energy science, and in particular the interdisciplinarity required for exciting breakthroughs in the field. Energy Advances welcomes research from any related discipline including materials science, engineering, technology, biosciences and chemistry."

 

By Dan Pollock and Ann Michael February 20, 2025
Overview A recent post on the Open Café listserv posed a question about the true extent of fee-free open access publishing, but it noted the incomplete coverage of the data cited. We have more comprehensive data, but just as we started our analysis, DeepSeek’s release sent markets into turmoil. The stage was set for a timely experiment. We first answer the question using our data. Then we see how the AI did. Background What proportion of open access is not paid for by APCs? In discussing this, a recent Open Café listserv post cited studies by Walt Crawford – a librarian, well-known in the academic library and OA communities for his analysis of open access. He has paid particular attention to “diamond” OA journals, which charge neither readers nor authors. His studies are based on data from the Directory of Open Access journals ( DOAJ ). Excellent though both sources may be – and, full disclosure, we contribute to the DOAJ – the DOAJ’s remit covers only fully OA (“gold”) journals. As listserv founder Rick Anderson noted, “By counting only articles published in DOAJ-listed journals, Crawford’s studies radically _undercount_ the number of APC-funded OA articles published – because DOAJ does not list hybrid journals, which always charge an APC for OA and which produce a lot of genuinely OA articles (though exactly how many, no one knows).” Using our data Actually, we do know … or at least have some fair estimates of hybrid OA. Our data allows us to determine the share of open access output in APC-free journals, as follows.
By Dan Pollock and Ann Michael February 11, 2025
Overview Following the 2024 US election, the new US administration has instructed employees in some key federal agencies to retract publications arising from federally funded research. This is to allow representatives of the administration to review the language used, to ensure it is consistent with the administration’s political ideology. In this special edition of News & Views, we quantify how many papers might be affected and estimate their share of scholarly publishers’ output. The initial numbers may be small, but we suggest the effects on scholarly publishing could be profound. Background On 20 January 2025, Donald J. Trump took office as the 47th President of the United States. Within hours he signed an Executive Order 1 (EO) 14168 proclaiming that the US government would only recognize two sexes, and ending diversity, equity, and inclusion (DEI) programs inside federal agencies. The following day, his administration instructed federal health agencies to pause all external communications – “such as health advisories, weekly scientific reports, updates to websites and social media posts” – pending their review by presidential appointees. These instructions were delivered to staff at agencies inside the Department of Health and Human Services (DHSS), including the Food and Drug Administration (FDA), the Centers for Disease Control (CDC) and Prevention, and the National Institutes of Health (NIH). The events that followed are important, as they directly affect scholarly papers and our analysis. A memo on 29 January instructed agencies to “end all agency programs that … promote or reflect gender ideology” as defined in the EO. Department heads were instructed to immediately review and terminate any “programs, contracts, and grants” that “promote or inculcate gender ideology.” Among other things, they were to remove any public-facing documents or policies that are trans-affirming and replace the term “gender” with “sex” on official documents. 
By the start of February, more than 8000 web pages across more than a dozen US government websites were taken down . These included over 3000 pages from the CDC (including 1000 research articles filed under preventing chronic disease, STD treatment guidelines , information about Alzheimer’s warning signs, overdose prevention training , and vaccine guidelines for pregnancy). Other departments affected included the FDA (some clinical trials), the Office of Scientific and Technical Information (the OSTP, removing papers in optics, chemistry and experimental medicine), the Health Resources and Services Administration (covering care for women with opioid addictions, and an FAQ about the Mpox vaccine). Around this time, it further emerged that CDC staff were sent an email directing them to withdraw manuscripts that had been accepted, but not yet published, that did not comply with the EO. Agency staff members were given a list of about 20 forbidden terms, including gender, transgender, pregnant person, pregnant people, LGBT, transsexual, nonbinary, assigned male at birth, assigned female at birth, biologically male, biologically female, and he/she/they/them. All references to DEI and inclusion are also to be removed. The effects of the EO Commenting on the merits of policy and ideology lies beyond our remit. However, when these matters affect the scholarly record – as they clearly do here – then they are of interest for our analyses. Specifically, what might the effects of the EO be on the publication of papers, and what effects might accrue from withdrawal of research funding? If federal agencies are being instructed to withhold or withdraw submissions, then, to quantify what this might mean to publishers, we have estimated the volume of output from a few key federal agencies. It is summarized in the following chart. 
By Lori Carlin January 23, 2025
Emerging technologies are reshaping how we create, distribute, and consume content. Publishers face the critical task of making smart technology investments to stay competitive and enable strategic objectives. How do you ensure that your next tech purchase aligns with your organization's needs and goals? Enter the needs assessment process – your roadmap to making informed, strategic technology decisions. From defining clear objectives to creating a comprehensive RFP, these best practices will help you navigate the decision-making process with confidence and ensure that your investments deliver value for your organization and your customers. Technology is not a solution; it is a tool. The temptation to adopt technology without a clear definition of what you are trying to achieve is an all too common (and usually very costly) mistake. Does your strategy include delivering a more personalized experience for your users? A customer data platform may be the right technology. Interested in using AI to build research integrity into your editorial process? Perhaps it’s time to revisit the capabilities of your editorial management system. Looking to support education and learning for students, faculty, and professional learners? Maybe it is time to evaluate formal learning management systems. Once you are confident about what you are seeking to achieve, the real work begins. Here are the key components that will help lay the foundation for a successful process from inception to deployment: Analyze Current State: Audit existing systems and processes to understand current capabilities and limitations. Conduct a Gap Analysis: Identify gaps between current capabilities and desired future state. Collect and Analyze Data: Gather qualitative and quantitative data from staff, users, customers, industry benchmarks, and about existing systems. Consider Resources and Constraints: Assess available resources, including budget, skills, and time. 
Research Solutions: Investigate potential technologies and/or types of solutions that could address identified gaps. Prioritize Needs: Work with stakeholders to prioritize needs based on impact and feasibility. Create RFP: After identifying prioritized needs and potential solutions, develop an RFP that clearly outlines project objectives, specific requirements, evaluation criteria, budget, and timelines. Distribute the RFP: Identify vendors with fit for purpose solutions and capabilities and distribute. Evaluate Proposals: Review vendor responses against established criteria and prioritize them based on how well they meet your needs. Plan for Adoption and Training: Consider the change management aspects of introducing new technology and processes. Be sure to develop a plan for user adoption, training, and ongoing support in your new systems. Technology as a Strategic Ally A methodical needs assessment is not just a procurement exercise – it is a strategic opportunity to reimagine how technology can transform your organization. The most successful technology investments are those that solve real problems, align with organizational goals, and empower your team to work more efficiently and creatively. Don’t fall into the trap of just moving what you are currently doing over to a new system. This is an ideal occasion to think about how you would design workflows and processes if you were to start from scratch and use that framework to evaluate the new capabilities available. You don’t want to duplicate what you are doing today; you want to step back and take the opportunity to build something better whenever possible. Customer Data Platform? Editorial Management System? Learning Management System? Something Else? Delta Think partners with publishers to do the foundational and implementation work required to ensure that technology decisions match the organization’s capabilities, fit the budget, and are grounded in voice-of-customer data. 
Our processes, including stakeholder interviews, surveys, and workshops, combined with expert landscape research, analysis, and assessments, underpin technology decision-making that is market-focused and customer-driven. If your 2025 objectives depend on or are enabled by technology, we’d welcome the opportunity to help you learn, plan, achieve. Please contact us today to start the conversation.
By Dan Pollock and Heather Staines January 14, 2025
This month’s topic: How reliable are the headlines you read in reports? Scroll down to read about this topic, along with the latest headlines and announcements. Delta Think publishes this News & Views mailing in conjunction with its Data & Analytics Tool . Please forward News & Views to colleagues and friends, who can register to receive News & Views for free each month. Delta Think will be attending several upcoming conferences, including NISO Plus (Feb 10-12) and Researcher to Reader (Feb 20-21). We would love to see you there – please get in touch or visit our Events page to see all the meetings we will be attending. How reliable are the headlines you read in reports? O verview A number of sources provide information about patterns in the overall scholarly journals market. However, as we so often mention in our analyses, important nuances lie beneath the headlines. This month we explore just how much variation exists and highlight the importance of specificity. Background As part of our annual market updates, we estimate the proportions of open vs. subscription access content each year. Over the last few years, we have observed how OA has approached 50% of output, but we note that it has yet to punch through that number. However, this headline varies greatly depending on your area of publishing. An example from physics The chart below shows the nuances across just a few of the 200+ subjects that we track.
By Dan Pollock, Ann Michael December 10, 2024
This month’s topic: How much content can AI legally exploit? Scroll down to read about this topic, along with the latest headlines and announcements. Delta Think publishes this News & Views mailing in conjunction with its Data & Analytics Tool . Please forward News & Views to colleagues and friends, who can register to receive News & Views for free each month. Delta Think will be attending several upcoming conferences, including APE (Jan 14-15), NISO Plus (Feb 10-12), and Researcher to Reader (Feb 20-21). We would love to see you there – please get in touch or visit our Events page to see all the meetings we will be attending. How much content can AI legally exploit? O verview During the recent PubsTech conference , we were asked how much content could be legitimately used to train artificial intelligence systems without being specifically secured through a licensing agreement. In considering this question, we find some counterintuitive results. Background Generative AI (genAI) is a type of artificial intelligence that can create new content—text, images, music, and more – by analyzing patterns in massive datasets. These models are typically trained on publicly available data scraped from the web. In the US, developers often invoke the “Fair Use” copyright doctrine to justify this training, claiming it is limited to specific purposes (training) and transformative in nature (different from the original use). In reality, the legal position is complex and evolving , with many rights holders and their representatives – unsurprisingly – taking the opposite view. Even if legal clarity emerges, different geographies and jurisdictions will likely reach different conclusions. The legal complexities of AI and copyright law are beyond our scope. However, for scholarly publishers, particular issues apply. Half of our output is open access , and open access content is designed to be reusable. 
Open or not, content has varying restrictions on onward use – for example, non-commercial use is often allowed with attribution. How much scholarly content is exploitable?  For the purposes of analysis, we will assume that the license under which content is published will have a material bearing on the legitimacy of its use to train AI systems. Therefore, looking at share of licenses, we might be able to answer our question.
A blue hot air balloon is flying in the night sky.
By Lori Carlin December 6, 2024
Welcome to the next issue of Delta Think's Ideas in Action - ideas that spark your imagination and encourage creativity...information that makes you stop and THINK! Want to know more about partnering with Delta Think? Contact Delta Think at info@deltathink.com to set up a time to meet and learn more. Charleston Conference 2024 Reflections November always marks several noteworthy activities and events both personally and professionally, including one of our favorites – the Charleston Conference – where stakeholders from all areas of our industry – librarians, service providers, and publishers alike, get the opportunity to debate, collaborate, and share insights. Richard Charkin, OBE, described the Conference this way in his 2024 opening keynote remarks: “This meeting is incredibly important. Serious people debating serious issues.” We agree and add that the spirit of Charleston is also grounded in engagement – with colleagues and friends and making time for a bit of fun. Karaoke optional! Whether you were able to attend or not, here are some reflections on the 2024 Conference from the Delta Think Team. Libraries as Leaders – Lori Carlin The first thing that hit me was the energy of the conference overall; it was invigorating. Walking into the exhibit area on Vendor Day, you could sense a heightened level of interest from attendees eager to see and hear about new and interesting developments. Is it AI that is fostering this renewed energy? AI is certainly a hot topic, as stakeholders wonder how to best incorporate AI into their products, services, and workflows. Or perhaps the spotlight on Research Integrity and the various products that can help the scholarly community address these issues. Whatever the reason, I have always appreciated Charleston’s approach to exhibits, with a single dedicated day for vendors to showcase their wares, and the packed ballroom left no doubt that this concentrated attendee/vendor time was appreciated by all. 
As for sessions, the Opening Keynote featuring Katina Strauch and Richard Charkin was interesting – both bringing their own sense of wit to their description of their different but equally circuitous paths to scholarly publishing and their eventual role as community leaders. I also have to call out a session I moderated – “Keeping Libraries as Central Players in an Evolving Teaching and Learning Space,” and not because I moderated it! It was the librarian panelists as well as the interaction from the audience that made this session lively and interesting. What it reinforced for me is the leadership role librarians now play as not only information resource agents and gatekeepers in their communities, but data analysts, policy drivers, and educators, ensuring that advancements in teaching and learning are recognized and implemented. Books and eBooks in the Spotlight – Diane Harnish There was a noticeable “buzz” at Charleston around eBooks and book-based content. Whether for teaching and learning or research usage occasions, the value of book collections, or exploration of evolving funding models and roles, books were top-of-mind for librarians and publishers. For example, “Whose Future Is It? Practical Strategies for Supporting Community-led Open Access Book Publishing” focused on how libraries can take a leadership role in open access book publishing. The concurrent session was full of practical insights into how libraries develop effective strategies to support community-led and academy-owned OA book publishing, with an emphasis on equity. On a more macro-scale, Niels Stern, Managing Director, DOAB & OAPEN Foundation led a Neapolitan discussion entitled “Open Access Policies for Books: Librarian Roles in Nudging Institutional and National Change” which explored the work of the recently concluded PALOMERA Project, an initiative to examine and analyze the research policies and strategies for open-access books in 39 countries in the European research area. 
The project generated evidenced-based, actionable recommendations to “help ensure that books don't get ‘left behind’” in a global move toward open research. I found this session ideal for any stakeholder – library, funder, or publisher – interested in ensuring sustainable infrastructure for eBook, especially scholarly monographs. After more than 30 years in scholarly communication, this was my first Charleston and I will definitely be back! Research Integrity + AI and Copyright – Heather Staines Working closely with Dr. Elisabeth Bik and Dr. Ivan Oransky to explore research integrity issues was timely and enlightening. While there are many new tools to detect misconduct, both agreed that focusing on the human factor will be key—seeking change in research assessment and the kinds of publications that count. Their Neapolitan, “Challenges and Opportunities Around Research Integrity: A Conversation” session provided an informative overview of some of the most biggest challenges to research integrity (image manipulation, paper mills) and how Retraction Watch, COPE Guidelines, and other tools can be used by all stakeholders to raise awareness and help ensure the integrity of the scientific record. The other session which kept my interest was the “Long Arm of the Law” moderated by Ann Okerson. Copyright Clearance Center’s Roy Kaufman helped scope out the legal issues related to AI companies using copyrighted content to train their LLMs and shed some light on cases related to copyright and LLM training currently winding their way through the courts. ITHAKA’s Nancy Kopans followed JSTOR’s perspective as an aggregator working to balance the rights of copyright holders and publishers with the needs of students, faculty, and researchers. Definitely an area to watch! Katina’s Legacy – Meg White Charleston founder and convener Katina Strauch has passed the torch, but her legacy is a reminder that there is always more to discover, learn, and tackle. 
She never slows down and, in many ways, defines what it means to always be evolving, embodying a true growth mindset. Katina and Richard Charkin kicked off the conference with a “Fireside Chat” Keynote moderated by Richard Gallagher, President and Editor-in-Chief of Annual Reviews (and the new owner of the Charleston Hub). As Lori mentioned, these two trailblazers were meeting for the first time, but they reflected on shared pivotal moments in their professional lives, including the intersection of publishing and librarianship, as we have moved from the internet to the digitization of content and collections, and now to AI. I had the pleasure of interviewing Katina as part of the Charleston Leadership Interviews and the ATG Podcast, so watch for that conversation coming soon at the Charleston Hub. Her passion certainly informs many of the key values we strive for here at Delta Think as we work with the scholarly communications community to LEARN, PLAN, ACHIEVE. Bravo!

Finally, we offer our congratulations to writer, director, producer, and star Heather Staines and her merry band of players. Thank you for an entertaining look at libraries, publishing, education, research, academia, and more in “Schmetadata: The Musical,” a light-hearted start to the Conference’s final day.

Next Steps

What were your “aha moments” at Charleston 2024? What are your organization’s biggest priorities and challenges for 2025 and beyond? At Delta Think, we believe in the power of collaboration and innovation to drive progress. We can help you embrace change and unlock your potential. Reach out today to start the conversation – we look forward to hearing from you.

More Ideas

News & Views: Market Sizing Update 2024: Has OA Hit A Peak? (Oct 2024) – Each year, Delta Think’s Market Sizing analyzes the value of the open access (OA) scholarly journals market. This is the revenue generated by providers or the costs incurred by buyers of content.
We estimate the OA segment of the market to have grown to just over $2.2bn in 2023. This is only marginal growth over the previous year… (read more)

Content Licensing Do’s and Don’ts in the Age of AI (Oct 2024) – Artificial Intelligence’s (AI) seemingly endless capabilities and applications present great opportunities (and some challenges too) for publishers and societies across the publishing enterprise. One of the main emerging areas of both growth and caution is the potential to license scholarly content to AI providers—primarily to be used… (read more)

Exploring AI (Sept 2024) – AI technologies have already sparked profound changes across our industry, enabling machines to perform tasks that previously required an abundance of human intelligence. AI algorithms can analyze vast datasets to uncover patterns, LLMs can generate coherent text, and genAI can simulate human-like creativity. Here we explore some of… (read more)

Events

We’ll be attending the following events. Please contact us at info@deltathink.com if you’d like to set up a time to chat.

APE, January 14-15
Researcher to Reader, February 20-21
ER&L, March 3-6
London Book Fair, March 11-13
2025 NAS Journal Summit, March 19-20

Turn Your Ideas Into Action

A partnership with Delta Think can provide the expert insights you need to meet your goals and amplify your ability to:

Learn about new and evolving insights, perspectives, and possibilities
Market Research and Intelligence
Customer Insight and Experience
Data Analytics and Market Evidence

Plan your path forward to success
Business and Product Strategy
Commercial Optimization
Brand, Marketing, and CDP Strategies

Achieve your goals
Manage Change
Implement Projects, Products, and Partnerships
Build Results Metrics and Analysis

With our insatiable curiosity, coupled with our expertise in data-driven, evidence-based analysis and strategy development – TOGETHER – we will discover your best path forward. Want to know more?
Schedule a call today or visit deltathink.com
By Heather Staines October 31, 2024
We are proud to share a video recording of our October News & Views companion online discussion forum! Join us for our annual update of the market size and revenue share of Open Access and a lively conversation around the trends and the wider issues that may be informing the overall market in scholarly communications. If you missed the session, or if you attended and would like to watch or listen again or share it with colleagues, please feel free!
By Dan Pollock, Ann Michael October 22, 2024
Overview

Each year, Delta Think’s Market Sizing analyzes the value of the open access (OA) scholarly journals market. This is the revenue generated by providers or the costs incurred by buyers of content. We estimate the OA segment of the market to have grown to just over $2.2bn in 2023. This is only marginal growth over the previous year. It is a small fraction of the long-term historical growth of the OA segment. A reduction in the output of the large OA-only publishers has had a profound effect on the market. It has benefited established publishers, who are seeing growth in OA even while the overall market softens. We expect this pattern to continue in 2024. Have we reached peak open access? Have the underlying drivers of OA changed? And are we now in an era of lower OA growth?

Headline findings

Our models suggest the following headlines for open access market sizing:
By Lori Carlin October 21, 2024
Artificial Intelligence’s (AI) seemingly endless capabilities and applications present great opportunities (and some challenges too) for publishers and societies across the publishing enterprise. One of the main emerging areas of both growth and caution is the potential to license scholarly content to AI providers—primarily to be used to “train” large language models (LLMs). While this type of licensing opportunity may be compelling, it requires thoughtful integration into the organization’s overall content portfolio management and revenue strategy.

Recently announced licensing agreements between scholarly and academic publishers and technology companies highlight AI’s insatiable demand for primary, verified, reliable information. AI developers rely on this high-quality, vetted content to train models, refine algorithms, and enhance natural language processing capabilities. This demand can present a lucrative opportunity for publishers to license content – aka the knowledge needed for training. It also raises important strategic questions about ownership, sustainability, and long-term business models that should not be ignored in the process.

Opportunity vs. Risk: Licensing Content Do’s and Don’ts

If a partnership with an AI company seems intriguing, it is…as long as you proceed with an understanding of how this opportunity may play out for your organization and where on the classic innovation adoption curve you are comfortable. Here is a handy checklist to help you evaluate the opportunities and risks of licensing content to AI providers. Keep in mind, YMMV, as will your priorities.

Do:

Integrate Licensing into Overall Content Strategy – View AI licensing as part of a broader content portfolio management plan to align with business objectives and sustain long-term value.

Prioritize Content Based on Value – Categorize content by demand and monetization potential to tailor licensing strategies for different segments (e.g., niche vs. broad appeal).
Introduce Strategic Pricing Models – Experiment with flexible pricing strategies like volume-based, usage-based, or hybrid models to reflect content value and accommodate AI providers’ diverse needs.

Complement and Enhance Existing Revenue Streams – Ensure that AI licensing supports rather than undermines other revenue channels (subscriptions, APCs, institutional licensing, etc.). Consider tiered access or differentiated pricing for recent vs. older content.

Collaborate with AI Companies Ethically – Build partnerships that ensure responsible content usage. Establish guidelines for ethical AI content generation, labeling, and attribution.

Protect Author Rights – Ensure that licensing agreements comply with existing contracts and protect authors’ rights. Proactively manage relationships with scholars to maintain trust and uphold their interests.

Be Prepared for Market Shifts – Experimentation is the order of the day, but the market and innovation are moving fast. Adopt flexible frameworks to quickly adjust to technological changes or shifts in demand for licensed content.

Maintain Transparency and Communication – Keep authors, research communities, and internal stakeholders informed about how the organization’s content is licensed and used by AI firms.

Consider Partnering with Other Content Providers – Strategically partner with publishing peers to offer a broader range of niche content. Collectively negotiate through a ‘power in numbers’ approach.

Don’t:

Rely Solely on AI-Driven Revenue – Avoid becoming over-reliant on revenue from AI licensing, as market shifts could jeopardize financial stability if demand for licensed content declines.

Undermine Content Value – Be cautious of pricing models that risk devaluing content over time, especially as AI-generated content becomes more sophisticated.

Ignore Unintended Consequences – Don’t overlook the potential for content devaluation or the blurring of lines between original research and AI-generated outputs.
Neglect Author Concerns – Don’t disregard the potential for author questions, dissatisfaction, or misuse of their work. Always respect contractual obligations and maintain productive relationships with the academic community.

Overlook Ethical Concerns – Avoid entering licensing agreements without ensuring ethical guidelines for the use of AI-generated content, including issues like data privacy and security.

Ignore the Long-Term Impact on Scholarly Publishing – Don’t assume AI-driven licensing won’t affect traditional publication models. Proactively assess how AI might impact and change peer review, publication demand, and researcher incentives.

Final Thoughts

Licensing content to AI providers is certainly a potential opportunity for publishers. That opportunity also comes with possible risks and the need for some caution. These Do’s and Don’ts serve as a starting point to help you frame out how partnerships with AI providers may or may not “fit” with your strategy, mission, and organizational goals, while acknowledging the need to consider safeguards to protect the integrity of your content, author relationships, and long-term sustainability. Delta Think can help your organization understand the unique opportunities and challenges of integrating AI licensing into a comprehensive content portfolio management strategy. Ready to start the conversation? Contact us today.

As Ideas in Action went to press, Ithaka S+R announced a Generative AI Licensing Agreement Tracker to help capture the details, impact, and strategy of these deals.
By Dan Pollock and Heather Staines September 18, 2024
In July, we shared a sneak peek at the 2023 market size, based on our annual publisher survey, and we’re currently heads-down finalizing our analysis of the trends, along with the corresponding revenue for both fully OA and hybrid content. Look for this important update in News & Views in mid-October. We’ll also hold our annual free webinar… Read More