
Red Hat Summit and AnsibleFest 2024: AI’s Linux Moment

The 2024 Red Hat Summit and AnsibleFest in Denver highlighted the pivotal role of open-source AI in accelerating organizational transformation and innovation. The event emphasized the transformative potential of open-source AI infrastructure and featured several significant announcements, among them the open sourcing of IBM’s Granite models. Red Hat’s commitment to an open partnership model was evident, aiming to democratize AI through collaborations that tailor solutions to customer needs. Other key announcements included RHEL image mode and RHEL AI, which focus on enhancing the deployment and management of AI applications across hybrid cloud environments. The event also marked the launch of InstructLab, a community-driven project for creating synthetic data and fine-tuning large language models (LLMs) in an open-source framework. Additionally, OpenShift AI’s evolution was showcased with new capabilities for model deployment and management, focusing on enhanced developer productivity and operational efficiency at scale. Overall, the event underlined the critical role of community and open-source frameworks in advancing AI technologies and strategies for future developments.


Focus on Open-Source with AI at the Forefront

The central theme throughout the event was how to get further, faster with open and built-for-AI infrastructure. It emphasized the transformative potential of open-source AI, from models to tooling, in driving organizational change and modernization. Unlike proprietary AI solutions, open-source AI provides greater flexibility and collaboration opportunities, fostering a robust ecosystem of partners across various stages of implementation. One of the largest announcements, made jointly with IBM, was the open sourcing of the IBM Granite models and code assistant under the Apache license.
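
To make the licensing point concrete, openly licensed Granite models can be pulled into standard open-source tooling like any other community model. The following is a minimal sketch using the Hugging Face transformers library; the repository ID is our assumption for illustration, so check the ibm-granite organization on Hugging Face for the model names actually published, and note that device_map="auto" assumes the accelerate package and suitable hardware.

```python
# Minimal sketch: load an Apache-licensed Granite model with Hugging Face transformers.
# The model id below is an assumption for illustration; see the ibm-granite organization
# on Hugging Face for the models actually published.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-7b-base"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # needs accelerate

prompt = "Summarize why open licensing matters for enterprise AI."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```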

Commitment to Open Partnership Model

Red Hat’s commitment to an open partnership model was another key takeaway. This approach involves engaging multiple partners to create comprehensive solutions tailored to customer needs. By offering a wide range of choices, Red Hat aims to democratize AI and facilitate enterprise-level model tuning and on-premises implementations. According to Red Hat, it is “Customer First, Partner Always.”

Red Hat Enterprise Linux (RHEL) Image Mode, RHEL AI, InstructLab, and Podman AI Lab

Red Hat introduced “image mode” for Red Hat Enterprise Linux, a novel deployment strategy that provides the operating system as a container image, aligning with contemporary application development practices and using OpenShift’s Kubernetes as the control plane. This approach aims to simplify the building, deploying, and managing of the operating system using container-native workflows, enabling seamless integration with DevOps practices like CI/CD and GitOps. Designed to enhance flexibility, scalability, and responsiveness, image mode addresses the dynamic needs of hybrid cloud environments and AI-driven applications. It supports organizations in managing their technology from data centers to the cloud, ensuring consistency and security across various platforms by providing a golden image on top of which additional packages can be layered. This gets organizations out of managing the OS and lets them focus on the layers above it, where their applications provide value.
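
For readers who want a feel for what “the OS as a container image” means in practice, here is a minimal sketch of the workflow, assuming a bootable golden image as the base and using the same podman build step developers already use for applications. The base image reference and package choice are placeholders, not official Red Hat artifacts.

```python
"""Illustrative sketch of an image-mode-style build: write a Containerfile that layers
packages on an assumed bootable "golden image" base, then build it with podman.
Image names and packages are placeholders; requires podman to be installed."""
import pathlib
import subprocess
import tempfile

# Hypothetical Containerfile: the FROM reference stands in for your golden image.
CONTAINERFILE = """\
# assumed base image reference; substitute the bootable OS image you actually use
FROM registry.example.com/rhel-bootc:9
RUN dnf -y install httpd && dnf clean all
"""

def build(tag: str = "localhost/my-os:latest") -> None:
    with tempfile.TemporaryDirectory() as ctx:
        pathlib.Path(ctx, "Containerfile").write_text(CONTAINERFILE)
        # The same container-native build step used for applications, applied to the OS.
        subprocess.run(["podman", "build", "-t", tag, ctx], check=True)

if __name__ == "__main__":
    build()
```

The resulting image can then be versioned, pushed to a registry, and rolled out through the same CI/CD and GitOps pipelines as any other container artifact, which is the consistency point behind image mode.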

RHEL AI goes beyond image mode, which is available for standard RHEL, and is more like a virtual appliance for AI. It includes InstructLab, a new open-source project based on an IBM Research paper describing a new AI model alignment method called Large-scale Alignment for chatBots (LAB), which leverages taxonomy-guided synthetic data generation and a multi-phase tuning framework to make AI more accessible and reduce reliance on costly human annotations. This technique allows models to be enhanced by attaching skills and knowledge to a taxonomy, then generating and using synthetic data at scale for training. Building on the success of LAB, IBM and Red Hat launched InstructLab, an open-source community that aims to simplify the process of developing, building, and contributing to large language models (LLMs) using IBM’s open-source Granite models. Its biggest advantage is that the end-user organization can train privately on its own data and create the synthetic data needed to scale and fine-tune the open models it leverages, be it Granite or other open LLMs. InstructLab has already integrated the Granite 7B English language model and plans to support Granite code models soon, allowing users to actively contribute to the models’ evolution much like any other open-source project.
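
To ground the taxonomy idea, the sketch below composes a minimal skill contribution of the kind InstructLab’s synthetic data generation starts from. The qna.yaml layout, file path, and the ilab commands shown in comments are assumptions drawn from the project’s early public examples; check the InstructLab repositories for the current schema before contributing.

```python
"""Minimal sketch of an InstructLab-style skill contribution.
The schema, paths, and CLI commands referenced here are assumptions based on the
project's early examples; consult the InstructLab taxonomy repo for the real format."""
import pathlib
import yaml  # requires PyYAML (pip install pyyaml)

skill = {
    "version": 2,
    "task_description": "Answer questions about Red Hat Summit 2024 announcements.",
    "created_by": "example-contributor",  # hypothetical contributor id
    "seed_examples": [
        {
            "question": "What is RHEL image mode?",
            "answer": "A way to build and manage RHEL as a bootable container image.",
        },
        {
            "question": "What does InstructLab generate from seed examples?",
            "answer": "Taxonomy-guided synthetic data used to fine-tune the model.",
        },
    ],
}

# Illustrative location under a local checkout of the taxonomy tree.
dest = pathlib.Path("taxonomy/compositional_skills/summit_recap/qna.yaml")
dest.parent.mkdir(parents=True, exist_ok=True)
dest.write_text(yaml.safe_dump(skill, sort_keys=False))
print(f"Wrote seed skill to {dest}")

# From here the workflow (commands assumed from the project's README) would be:
#   ilab generate   # synthesize training data guided by the taxonomy
#   ilab train      # fine-tune the local model on the generated data
```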

Add in Podman AI Lab, through which Red Hat aims to accelerate AI application development from the local laptop and make it easier for developers, security teams, and infrastructure managers to leverage the robustness and reliability of Red Hat Enterprise Linux across the hybrid cloud landscape.

Red Hat OpenShift and OpenShift AI

Red Hat OpenShift AI, previously called OpenShift Data Science, continues to evolve, and the event saw the announcement of version 2.9. This version of OpenShift AI introduces various features designed to enhance AI model deployment and management across hybrid deployments. Model serving at the edge, a technology preview feature, extends AI deployment to remote and resource-limited locations using single-node OpenShift, facilitating inferencing with limited connectivity. It supports multiple model servers, including KServe and TGIS, allowing simultaneous use of predictive and generative AI for diverse applications, thereby reducing costs and streamlining operations. The platform also integrates distributed workload management using Ray, CodeFlare, and KubeRay to accelerate data processing and model training across multiple cluster nodes, optimizing resource allocation such as GPU distribution. Additionally, OpenShift AI enhances model development with flexible project workspaces and workbench images supporting tools like VS Code and RStudio, and it offers improved observability through model monitoring visualizations. New accelerator profiles further tailor the environment for varied hardware needs, simplifying access to appropriate accelerators for specific tasks.
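
To make the distributed-workload capability concrete, here is a minimal Ray sketch of the kind of job CodeFlare and KubeRay schedule across OpenShift AI cluster nodes. It runs locally as written; targeting a managed RayCluster (for example through the CodeFlare SDK) is assumed rather than shown, and the resource hints are placeholders.

```python
"""Minimal Ray example: fan a preprocessing step out across workers.
Runs locally as written; on OpenShift AI the same code would target a
KubeRay-managed cluster, which is assumed rather than shown here."""
import ray

ray.init()  # local mode; inside a managed cluster you would connect to its address

@ray.remote(num_cpus=1)  # resource hints let the scheduler spread work (GPUs via num_gpus)
def preprocess(shard: list[int]) -> int:
    # Stand-in for real work such as tokenization or feature extraction.
    return sum(x * x for x in shard)

shards = [list(range(i, i + 1000)) for i in range(0, 10_000, 1000)]
futures = [preprocess.remote(s) for s in shards]
print("processed:", sum(ray.get(futures)))
```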

Enhancements in Developer and Operational Efficiency

It is important to lay out a clear roadmap for AI, understanding that it is a journey rather than a destination. Several announcements focused on improving developer productivity and securing long-term AI implementations. The introduction of policy as code for Ansible and the extension of Lightspeed to both RHEL and OpenShift were particularly noteworthy.

Migration and Modernization

There was also a lot of discussion about OpenShift Virtualization and how it is a viable option for some virtual machine workloads. Built on the open-source project KubeVirt, OpenShift Virtualization prompted much conversation around which VM workloads running on VMware today might transition to it soon.

Key topics included migration to OpenShift Virtualization and the use of RHEL image mode for enhanced security. A great example of the latter was Salesforce’s move from CentOS to RHEL 9, which underscores Red Hat’s efforts to simplify transitions and improve operational efficiency. It should be noted that there was no angst to be found around the licensing changes for CentOS, with most seeming to see CentOS Stream as sufficiently open.

Leveraging AI for Better Business Outcomes

The discussions with customers and partners revealed a common theme: leveraging AI to accelerate time to market and improve developer productivity. The cognitive load reduction for developers, as highlighted by the integration of Lightspeed and InstructLab, demonstrates Red Hat’s commitment to making AI more accessible and manageable.

Hands-On Learning with InstructLab

The enthusiasm for learning and hands-on experience was palpable, with long lines at the InstructLab sessions. This eagerness to engage with the technology and understand its practical applications reflects the community’s commitment to innovation.

Open Source and Community Support

The emphasis on community-driven projects like Konveyor and the support for open-source initiatives by Red Hat and IBM further underline the collaborative spirit of the event. By leveraging the community’s collective knowledge and resources, Red Hat aims to drive innovation and support the successful implementation of AI across various industries.

Security and AI

A lot is happening at the intersection of AI and security, as AI itself becomes a larger target and securing it grows more complex, including the challenges of ensuring safety and preventing data leakage. Understanding the provenance of the data used in AI models is increasingly important, and the potential to use InstructLab to filter out undesirable data inputs has some exciting applications. But there is a need for a balanced approach to security, where not all vulnerabilities are treated equally and significant threats take priority over minor ones. As the industry addresses the cybersecurity skills gap, there needs to be a shift from a compliance-focused security mindset to one more aligned with current technological and threat landscapes. We found that AI’s potential to reduce the workload on security professionals, allowing them to focus more on proactive security measures in code and open source, could be one of its biggest roles.

Our Perspective

The 2024 Red Hat Summit and AnsibleFest showcased significant advancements and a clear vision for the future of open-source AI and infrastructure technology. The focus on open partnerships, developer and operational efficiency, and community engagement underscores Red Hat’s commitment to driving innovation and delivering better business outcomes, such as potentially lower costs and faster time to innovation through this community approach to AI. InstructLab is a great example, going from IBM Research paper to MVP in about 10 weeks. As the pace of technological innovation accelerates, staying informed and leveraging the right partners will be crucial for organizations looking to navigate the evolving landscape.

Reflecting on the event, we cannot emphasize enough the importance of understanding that getting to AI is a journey, not a destination. What Gen AI is today, dominated by prompts and who has the biggest and fastest LLM, is not what it will be in 12 months. Understanding this will require continuous learning and adaptation of the AI models. With a consistent thread of innovation and practical implementation, Red Hat is well-positioned to support organizations in their AI and modernization efforts.

For those who missed the live sessions, theCUBE’s comprehensive coverage of the Red Hat Summit and AnsibleFest is available online, offering valuable insights and analysis for anyone looking to stay informed about the latest developments in the tech world.
