
Simplify development, production, & adoption of GenAI apps for enterprise

The OPEA platform includes:

  • Detailed framework of composable building blocks for state-of-the-art generative AI systems including LLMs, data stores, and prompt engines
  • Architectural blueprints of retrieval-augmented generative AI component stack structure and end-to-end workflows
  • A four-step assessment for grading generative AI systems on performance, features, trustworthiness, and enterprise-grade readiness
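To make the component list concrete, here is a minimal sketch of the composable retrieval-augmented generation (RAG) pattern the blueprints describe: an embedder, a retriever, a prompt engine, and an LLM as independent, swappable stages. All names below are hypothetical illustrations, not OPEA APIs, and the word-overlap "embedding" and stub LLM stand in for real models.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Document:
    text: str

def embed(text: str) -> set:
    """Toy embedding: the set of lowercased words (stands in for a real model)."""
    return set(text.lower().split())

class Retriever:
    """Minimal in-memory store that scores documents by word overlap."""
    def __init__(self, docs: List[Document]):
        self.docs = [(d, embed(d.text)) for d in docs]

    def top_k(self, query: str, k: int = 1) -> List[Document]:
        q = embed(query)
        ranked = sorted(self.docs, key=lambda pair: len(q & pair[1]), reverse=True)
        return [d for d, _ in ranked[:k]]

def prompt_engine(query: str, context: List[Document]) -> str:
    """Assemble retrieved context and the user query into a single prompt."""
    ctx = "\n".join(d.text for d in context)
    return f"Context:\n{ctx}\n\nQuestion: {query}"

def rag_pipeline(query: str, retriever: Retriever,
                 llm: Callable[[str], str]) -> str:
    """Retrieve, build a prompt, generate. Each stage can be swapped out."""
    docs = retriever.top_k(query)
    return llm(prompt_engine(query, docs))

docs = [Document("OPEA provides composable building blocks for GenAI."),
        Document("Vector stores index document embeddings.")]
retriever = Retriever(docs)
echo_llm = lambda prompt: prompt.splitlines()[1]  # stub LLM: echoes the first context line
answer = rag_pipeline("What does OPEA provide?", retriever, echo_llm)
```

The point of the sketch is the seams, not the toy internals: swapping in a real vector store or a hosted model changes one component without touching the rest of the pipeline.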

Efficient
Harnesses existing infrastructure, including the AI accelerator or other hardware of your choosing.

Seamless
Integrates with enterprise software, with heterogeneous support and stability across system & network.

Open
Brings together best-of-breed innovations and is free from proprietary vendor lock-in.

Ubiquitous
Runs everywhere through a flexible architecture built for cloud, data center, edge and PC.

Trusted
Features a secure enterprise-ready pipeline and tools for responsibility, transparency, and traceability.

Scalable
Provides access to a vibrant ecosystem of partners to help build and scale your solution.

We're thrilled to welcome OPEA to LF AI & Data with the promise of open-source, standardized, modular, and heterogeneous RAG pipelines for enterprises, with a focus on open model development and hardened, optimized support of various compilers and toolchains. OPEA will unlock new possibilities in AI by creating a detailed, composable framework that stands at the forefront of technology stacks. This initiative is a testament to our mission to drive open-source innovation and collaboration within the AI and data communities under a neutral and open governance model.

Ibrahim Haddad, Executive Director, LF AI & Data

With the potential that generative AI has to shape our future and the ways we do business, it’s imperative that we tap into the power of collaboration for innovation and accuracy in enterprise AI. We're excited to be a part of the newest AI & Data Sandbox project and to work with other industry leaders on OPEA.

Shadi Shahin, VP, Product Strategy, SAS

Enterprises integrating their choice of cutting-edge tools into their AI platforms aren’t just ahead of the AI curve—they’re defining it. OPEA's vision aligns with our commitment to open, flexible, governed AI innovation, and we're proud to support it alongside Intel.

Thomas Robinson, COO, Domino Data Lab

Hugging Face's mission is to democratize good machine learning and maximize its positive impact across industries and society. By joining OPEA's open-source consortium to accelerate Generative AI value to enterprise, we will be able to continue advancing open models and simplify GenAI adoption.

Julien Simon, Chief Evangelist, Hugging Face

Intel is at the forefront of incubating open source development to build trusted, scalable open infrastructure that enables heterogeneity and provides a platform for developer innovation. Generative AI is at this moment; OPEA, with the support of the broader community, will address critical pain points of RAG adoption and scale today. It will also define a platform for the next phases of developer innovation that harnesses the potential value generative AI can bring to enterprises and all our lives.

Melissa Evers, VP of Software and Advanced Technology Group and GM of Strategy to Execution, Intel Corporation

"At DataStax, we help enterprises succeed with AI and RAG. We love to partner and collaborate with other open community companies and organizations like Intel and the Linux Foundation to develop standards to help the industry move forward. As we expand our portfolio to ease RAG Application development, we see the Open Platform for Enterprise AI (OPEA) as a foundation that will drive modularity, scale and hardening to provide a platform for future innovation."

Davor Bonaci, CTO, DataStax

At dstack, we're building a new approach to AI infrastructure management aimed at leveraging open source and ensuring its portability across multiple infrastructure and model vendors. We believe the mission of the OPEA initiative is crucial for the safety and democratization of enterprise AI. We're excited to be a part of it.

Andrey Cheptsov, CEO & Founder, dstack

Collaborative, open-source projects like OPEA fuel our excitement for the future of gen AI because of the ability it has to drive acceleration of both innovation and adoption within enterprise organizations. The power of RAG is undeniable, and its integration into gen AI creates a ballast of truth that enables businesses to confidently tap into their data and use it to grow their business.

Michael Gilfix, Chief Product and Engineering Officer, KX

As GenAI matures, integration into existing IT is a natural and necessary step. The world needs GenAI and vectors as part of a general purpose RDBMS, and we have already demonstrated our ability to deliver this through MariaDB Server. We see huge opportunities for core MariaDB users - and users of the related MySQL Server – to build RAG solutions. It's logical to keep the source data, the AI vector data, and the output data in one and the same RDBMS. The OPEA community, as part of LF AI & Data, is an obvious entity to simplify Enterprise GenAI adoption.

Kaj Arnö, CEO, MariaDB Foundation

The OPEA initiative is crucial for the future of AI development. Advocating for a foundation of open source and standards - from datasets to formats to APIs and models, enables organizations and enterprises to build transparently. The AI data infrastructure must also be built on these open principles. Only by having open source and open standard solutions, from models to infrastructure and down to the data are we able to create trust, ensure transparency and promote accountability.

AB Periasamy, CEO and co-Founder, MinIO

As the leading open-source vector database technology provider, Qdrant is excited to support the launch of the OPEA, underscoring the importance of open standards in AI for innovation and data sovereignty. Our commitment to these principles is rooted in our core, and we look forward to contributing to an ecosystem where AI thrives with a deep respect for data ownership.

Andre Zayarni, CEO & co-Founder, Qdrant

As gen AI continues to advance, open source is playing a critical role in the standardization and democratization of models, frameworks, platforms and the tools needed to help enterprises realize value from AI. Red Hat is excited about the potential for AI innovation for our customers through the Open Platform for Enterprise AI.

Steven Huels, VP and GM, AI Business Unit, Red Hat

We are pleased to collaborate with the Open Platform for Enterprise AI (OPEA), which offers essential guidance in a dense and complex market. Within OPEA, Yellowbrick serves as a data provider—recognizing data as the crucial fuel for AI. Our data warehouse incorporates advanced vector capabilities, enabling seamless integration of AI with current systems and workflows. This ensures that AI augments rather than interrupts business processes, simplifying AI adoption.

Mark Cusack, CTO, Yellowbrick Data

Cloudera is thrilled to join industry thought leaders like Intel in the Open Platform for Enterprise AI Alliance, embracing openness and collaboration to drive innovation and empower the future of generative AI.

Andy Moller, SVP of Global Alliances & Ecosystem, Cloudera

We firmly believe that vector databases are integral to the future of open generative AI, which is why we donated the Milvus vector database to the Linux Foundation back in 2020. Our support of OPEA is an extension of that commitment to creating a framework alongside Intel that fosters extensible, accessible, and scalable AI platforms for enterprise developers.

Charles Xie, CEO & Founder, Zilliz

We are seeing tremendous enthusiasm among our customer base for RAG, with organizations deploying RAG applications on-premises to empower employees and customers to find the information they need faster, creating greater efficiencies in customer service and document search. The constructs behind RAG can be universally applied to a variety of use cases, making a community-driven approach that drives consistency and interoperability for RAG applications an important step forward in helping all organizations to safely embrace the many benefits that AI has to offer.

Chris Wolf, Global Head of AI and Advanced Services, Broadcom

As pioneers of RAG-based systems, we are happy to see OPEA driving its adoption. RAG architectures are established and proven for building GenAI apps, and OPEA's efforts will help drive standards creation for RAG forward, thus accelerating enterprise adoption. We are proud to be part of this initiative with the Haystack framework.

Milos Rusic, CEO & Founder, deepset

We live in a dynamic and exciting time where new AI technologies are turning up literally every hour. OPEA takes on two important problems: helping enterprises better understand the GenAI landscape through open architecture patterns, and accelerating quality, value, and time to market with tools and best practices. Neo4j is proud to be a member and looks forward to contributing knowledge, open software, and best practices in the field of knowledge graphs for GenAI & GraphRAG.

Philip Rathle, CTO, Neo4j

Generative AI is revolutionizing how companies embrace technology and data. The OPEA initiative is a much-needed step to define solid architectural patterns, use cases and validation techniques to streamline organisations' adoption of scalable, secure and trustworthy Generative AI. As a member, we believe the OPEA initiative is important in enabling organizations to share innovative best practices and architectures to empower companies to embrace Generative AI.

Francesco Tisiot, Field CTO, Aiven

"OPEA is a game-changer for businesses looking to integrate generative AI workflows securely, efficiently, and at scale. At Prediction Guard, we believe in the power of open platforms to accelerate innovation and drive business value. By leveraging OPEA's composable building blocks and industry-standard best practices, our customers can build cutting-edge generative AI solutions faster than ever before. Our involvement with OPEA underscores our commitment to fostering an open, collaborative ecosystem that empowers businesses to realize the full potential of AI."

Daniel Whitenack, Founder and CEO, Prediction Guard

"Wipro is thrilled to collaborate and contribute to the Open Platform for Enterprise AI (OPEA) to harness the power of accelerating AI and build cutting-edge RAG applications. Wipro's vision of delivering vendor-agnostic and composable AI solutions has great synergy with OPEA. Through this synergy, we are committed to empowering enterprises with agile, scalable, and cost-effective AI solutions, paving the way for transformative business outcomes. This collaboration will help us foster innovation and accelerate the development of Gen AI-driven IT use cases that address the broad-spectrum needs of industry verticals."

Mayur Shah, General Manager and Global Head, DC & Hybrid Cloud - Practice Development, Eng, Wipro

At Articul8, we are building an Autonomous GenAI platform that empowers enterprises to build their domain-specific, expert-level GenAI applications. General-purpose LLMs are necessary but not sufficient for building expert-level enterprise applications. Multiple models, including domain-specific models, need to work together autonomously, and in a safe and secure environment. We are thrilled to be part of the OPEA ecosystem to foster collaborative and secure deployment of GenAI capabilities on the customer's infrastructure of choice, eliminating vendor lock-in. Being part of the OPEA ecosystem allows us to innovate with like-minded partners to accelerate our mission.

Arun Karthi Subramaniyan, CEO, Articul8.ai

"Corsha is pleased to join the Open Platform for Enterprise AI Alliance (OPEA) in advancing the integration and adoption of generative AI technologies. As AI evolves and becomes integral to business operations, it is essential to embed robust security and privacy measures at every step. At Corsha, we're committed to contributing our expertise in zero-trust architecture and non-human identity to ensure that enterprises can adopt generative AI confidently, knowing their data and workflows are protected."

Anusha Iyer, Founder/CEO, Corsha

"Infosys is thrilled to join the OPEA community. With the explosion of GenAI models, techniques like RAG, agent frameworks, and multi-modal use cases becoming mainstream, it’s crucial to build architectural blueprints, responsible AI guardrails, and tooling. Our investments in Infosys Topaz AI offerings are a step in this direction. As advocates of open source, Infosys will actively contribute to the OPEA project, making it available across platforms. We look forward to sharing our expertise, learning from peers, and shaping the future of GenAI."

Rafee Tarafdar, Chief Technology Officer, Infosys

"AMD is thrilled to join OPEA, aligning with our commitment to open-source innovation. We’re enabling enterprises to harness AI on AMD platforms from edge to cloud, driving business transformation with performance, energy efficiency, and optimized TCO."

Ramine Roane, Corporate Vice President of AI Product Management, AMD

We are thrilled to join forces with the OPEA community, reinforcing our commitment to advancing AI and data solutions on a global scale. By engaging with passionate communities and industries, we can drive innovation and simplify AI adoption across industries. This partnership will empower developers and help shape AI's future, ensuring trust and progress in AI-driven enterprises. Together, we will unlock new opportunities for AI-enhanced solutions in the ever-evolving digital landscape.

Xin Zhang, VP of Volcano Engine, ByteDance

“We are pleased to work with the Open Platform for Enterprise AI (OPEA), which simplifies enterprise generative AI solutions, accelerates enterprise innovation capabilities, and drives business value realization. BONC Technologies and OPEA will work together to foster an open, collaborative, and sustainable ecosystem platform to fully unleash the potential of artificial intelligence and enable enterprise digital transformation.”

Wang Hu, Vice President of BONC Technology

"Canonical is focused on security, developer productivity, and enterprise operations of open source. We are delighted to join the OPEA project and address production, compliance, and assurance requirements. With Canonical's Everything LTS, we provide a consistent and long-term supported AI stack that meets enterprise needs from development through deployment."

Mark Shuttleworth, CEO at Canonical

"Rivos looks forward to bringing its industry-leading power efficient, high performance, secure server solutions into the enterprise-focused OPEA project. By working as a community, we can speed up the development of business impactful GenAI workflows, maximising the return on AI investment."

Belli Kuttanna, Co-Founder and CTO at Rivos Inc.

"We at MongoDB are thrilled to be part of the OPEA community. MongoDB's powerful document database platform securely unifies operational, unstructured, and AI-related data to streamline building AI-enriched applications. We look forward to collaborating with OPEA to drive innovation and provide developers with the tools they need to succeed."

Gregory Maxson, Head of AI & Technology Partnerships, MongoDB

"AI is changing everything, and the foundation it’s built on must be secure. At Fr0ntierX, we believe in AI that people can trust and have built commercial versions of RAGs in Trusted Execution Environments. Joining OPEA gives us the opportunity to bring our expertise to a community that shares our passion for pushing boundaries with what's possible, without compromising integrity."

Jens Albers, CTO and Founder, Fr0ntierX

"At Zensar, we realized tremendous collaboration would be required to drive successful enterprise AI solutions. We are delighted to be part of the OPEA consortium to help drive this thinking forward collectively with leading companies to support this goal. We have approached the complexity of the landscape through a lens of “leading with the experience and leaping with intelligence.” In combination with the open, adaptable, and scalable vision of OPEA, we believe this will be a winning strategy for all of us to succeed and help achieve our collective goals of delivering impactful Enterprise AI Solutions."

Tim Burke, VP, Advanced Engineering, Zensar

The promise of AI is truly infinite but before the full promise is realized, there is always a process of trial and error. This is where OPEA.dev is an extraordinary resource. By providing enterprise-ready blueprints and accelerators we can all achieve the promise of AI benefits faster and more safely. We at ArangoDB are excited to be a part of this ecosystem and believe Graph-Powered GenAI can be a game changer.

Shekhar Iyer, CEO, ArangoDB

“Enterprise AI deployment requires not only technological innovation but also holistic transformation in mindset and organizational capabilities. We are witnessing three crucial trends in enterprise AI practices: companies are establishing comprehensive AI infrastructure that encompasses data, computing, and applications; they are building up AI productivity while ensuring data security; and AI applications are in the end truly serving business innovation, providing enterprises with sustainable competitive advantages. The OPEA project works towards solving these challenges, and we are proud to be a part of this community.”

Fei Li, VP of New H3C Group & Dean of the New H3C Artificial Intelligence Research Institute, H3C

"The future of AI and ML depends on our ability to process workloads efficiently and securely wherever data is created. At Expanso, we’ve seen firsthand how distributed computing can transform operations by bringing compute to the data, rather than the other way around. That’s why we’re excited to join OPEA - this initiative aligns perfectly with our vision of open, secure, and efficient enterprise computing. By working together, we can create standardized frameworks that make AI deployment more accessible, reliable, and cost-effective for organizations of all sizes.”

David Aronchick, CEO & Co-Founder, Expanso

"Given Plum AI's developer-tool focus on improving the quality of LLM applications, we're excited to join OPEA in its mission to make GenAI easier, faster, and more successful for everyone."

Julian Norton, CEO & Co-Founder, Plum AI

Contribute to OPEA

We invite like-minded industry peers to contribute to the development and standardization of enterprise-grade retrieval-augmented generative AI.

Contribute on GitHub
Contact Us

Stay Connected

Keep up to date with the latest news and initiatives from OPEA.