IBM contributes key open-source projects to Linux Foundation to advance AI community participation

Analytics & Cognitive News

IBM is contributing three open-source projects—Docling, Data Prep Kit and BeeAI—to the Linux Foundation. The move signals not only the potential growth of these projects but also IBM’s ongoing commitment to open-source AI.

“We're continuing our long history of contributing open-source projects to ensure that they're easy to consume and that it's easy for others—not just us—to contribute,” says Brad Topol, IBM Distinguished Engineer and Director of Open Technologies, in an interview. Topol also chairs the Governing Board of the LF AI & Data Foundation, a group hosted under the Linux Foundation focused on advancing open-source innovation across artificial intelligence and data technologies.

Each project is focused on an essential part of the AI development stack. As the industry matures, innovation driven by the broader developer community in these areas is key to making AI enterprise ready.

Docling, which launched as open source a year ago, addresses a limitation many foundation models face in enterprise use. While the models have been trained on nearly every scrap of publicly available information, much of the data valuable to businesses lies in documents that are not accessible online: PDFs, annual reports, slide decks.

Docling streamlines the process of turning unstructured documents into JSON and Markdown files that are easy for large language models (LLMs) and other foundation models to digest.
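To make that conversion concrete, here is a minimal, self-contained sketch of the idea: parsed document blocks rendered as both Markdown and JSON. This is not the Docling API (the real library exposes a `DocumentConverter` with export methods); the `to_markdown` function and block schema below are hypothetical stand-ins for illustration.

```python
import json

# Toy stand-in for Docling-style output: a parsed document is a list of
# typed blocks, which we render as Markdown or serialize as JSON so an
# LLM pipeline can consume it. The schema here is hypothetical.

def to_markdown(blocks):
    """Render parsed document blocks as Markdown text."""
    lines = []
    for block in blocks:
        if block["type"] == "heading":
            lines.append("# " + block["text"])
        elif block["type"] == "paragraph":
            lines.append(block["text"])
    return "\n\n".join(lines)

parsed = [
    {"type": "heading", "text": "Annual Report 2024"},
    {"type": "paragraph", "text": "Revenue grew 12% year over year."},
]

markdown = to_markdown(parsed)
as_json = json.dumps(parsed)  # same structure, serialized for downstream use
```

Either representation—Markdown for prompting, JSON for programmatic chunking—gives a model structured access to content that was previously locked inside a PDF.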

Since its release, Docling has gained traction, earning more than 23,000 stars on GitHub. When combined with retrieval-augmented generation (RAG) techniques, Docling improves LLM outputs. “Docling can make the LLMs answer much better and much more specific to their needs,” says Topol. In addition to gaining traction in the open-source community, Docling helps power Red Hat® Enterprise Linux® AI, where it enables context-aware chunking and supports the platform’s new data ingestion pipeline.
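The RAG step described above can be sketched in a few lines. Production systems use embeddings and a vector store; this toy version scores Docling-style chunks by keyword overlap purely to illustrate how retrieved chunks end up as LLM context. The `retrieve` function is hypothetical, not part of Docling.

```python
# Minimal RAG-style retrieval over document chunks: rank chunks by how
# many words they share with the query, then take the top k. Real
# deployments replace this scoring with embedding similarity.

def retrieve(chunks, query, k=1):
    q_words = set(query.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )
    return scored[:k]

chunks = [
    "Docling converts PDFs into Markdown for LLM ingestion.",
    "The annual report lists revenue by region.",
]
top = retrieve(chunks, "How does Docling handle PDFs?")
# top[0] would then be prepended to the LLM prompt as context
```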

Of course, another critical step in deploying AI is data preparation. IBM’s Data Prep Kit, which was released in 2024, has also gained popularity: it helps clean, transform and enrich unstructured data for pre-training, fine-tuning and RAG use cases.

Unstructured data—such as documents, web pages and audio files, which are more complex to parse and mine for insights—accounts for 90% of all enterprise-generated data, according to IDC. LLMs can analyze vast amounts of unstructured data and extract relevant insights to generate and test new product or service ideas, for instance, in hours rather than months.

Data Prep Kit is designed to simplify data prep for LLM applications—currently focused on code and language models—supporting pre-training, fine-tuning and RAG use cases. Built on familiar distributed processing frameworks like Spark and Ray, it gives developers the flexibility to create custom modules that scale easily, whether running on a laptop or across an entire data center.
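The modular design described above can be sketched as composable transforms over records. This is not the Data Prep Kit API—the real kit packages transforms as modules that run on Spark or Ray—and the `dedupe`, `drop_short` and `pipeline` names below are hypothetical, but the shape of the idea is the same: small, chainable data-cleaning steps.

```python
# Illustrative sketch of composable data-prep transforms. Each transform
# takes a list of records and returns a cleaned list; a pipeline chains
# them in order. Distributed execution (Spark/Ray) is assumed away here.

def dedupe(records):
    """Drop exact-duplicate texts, keeping first occurrence."""
    seen, out = set(), []
    for r in records:
        if r["text"] not in seen:
            seen.add(r["text"])
            out.append(r)
    return out

def drop_short(min_chars):
    """Build a transform that removes records too short to be useful."""
    def transform(records):
        return [r for r in records if len(r["text"]) >= min_chars]
    return transform

def pipeline(records, transforms):
    for t in transforms:
        records = t(records)
    return records

raw = [
    {"text": "IBM contributes Docling to the Linux Foundation."},
    {"text": "IBM contributes Docling to the Linux Foundation."},  # duplicate
    {"text": "ok"},  # too short to train on
]
clean = pipeline(raw, [dedupe, drop_short(10)])
```

Because each step is an ordinary function over records, a developer can prototype on a laptop and later swap in a distributed backend without changing the transform logic.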

“We used to say, garbage in, garbage out. You definitely want good data going in,” Topol says. “This is not a glamorous project compared to some of the other parts of the LLM life cycle, but it’s incredibly critical, incredibly valuable and a definite must-have.” Data Prep Kit is beginning to power IBM offerings and is now part of the tech preview of IBM Data Integration for Unstructured Data.

Finally, as AI agents gain traction, IBM has released BeeAI. Developers can use BeeAI to discover, run and compose AI agents from any framework, including CrewAI, LangGraph and AutoGen. The project includes the Agent Communication Protocol, which powers agent discoverability and interoperability, and the BeeAI framework, its native framework for building agents in Python or TypeScript, optimized for open-source models.

“There are other frameworks for building agents,” says Topol. “But what's nice about BeeAI is that it provides a platform where you can also plug in agents from those other technologies. BeeAI doesn't just work with its own agents.”
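The interoperability idea Topol describes—agents from different frameworks plugged into one platform—can be sketched as a common wrapper plus a registry. The `Agent` class, `register` and `compose` functions below are hypothetical illustrations, not the BeeAI or Agent Communication Protocol API.

```python
# Toy sketch of cross-framework agent interoperability: each external
# agent is wrapped behind one uniform interface, registered for
# discovery, and composed by chaining outputs into inputs.

class Agent:
    def __init__(self, name, run_fn):
        self.name = name
        self.run = run_fn  # adapter around a framework-specific agent

registry = {}

def register(agent):
    registry[agent.name] = agent  # discoverability: look agents up by name

def compose(agent_names, task):
    """Chain registered agents: each one's output feeds the next."""
    for name in agent_names:
        task = registry[name].run(task)
    return task

# Two "agents" standing in for wrapped CrewAI/LangGraph/AutoGen agents
register(Agent("summarizer", lambda text: text[:20]))
register(Agent("shouter", lambda text: text.upper()))
result = compose(["summarizer", "shouter"], "beeai composes agents together")
```

The point of the wrapper is that `compose` never needs to know which framework produced an agent, which is the role a shared protocol plays at platform scale.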

By contributing these projects to the Linux Foundation, IBM aims to expand their reach and attract new contributors and users. “The projects are in a wonderful spot where people can invest their resources. It makes a huge difference,” says Topol. “It's like an insurance policy. The open governance also makes people feel better that if they contribute, over time, they’re going to earn their stripes through what we call meritocracy and earn a more influential role in the project. They can also feel secure that the project won't make any drastic open-source license changes that could dramatically impede future use of the project.”

Pointing to Kubernetes—an open-source container orchestration system originally developed by Google and later donated to the Cloud Native Computing Foundation—Topol notes how its adoption surged after becoming part of an open governance model, ultimately turning it into an industry standard.

He has bold ambitions for these projects.

“An open-source project with a powerful ecosystem is, frankly, unstoppable,” he says.

IBM is a leading global provider of hybrid cloud, AI and business services, helping clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain a competitive edge in their industries. Nearly 3,000 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and business services deliver open and flexible options to our clients. All of this is backed by IBM's legendary commitment to trust, transparency, responsibility, inclusivity and service.

For more information, visit: www.ibm.com.
