- CNCF Executive Director Priyanka Sharma outlined six ways Kubernetes is enabling AI workloads
- Sharma's remarks were delivered at KubeCon in Chicago last week
- Flexibility, reliability, and transparency were on her list
KubeCon + CloudNativeCon North America 2023, Chicago – At KubeCon North America, Cloud Native Computing Foundation (CNCF) executive director Priyanka Sharma said, "Kubernetes is having its Linux moment." She's right.
Kubernetes, like Linux before it, is changing everything about IT. And one place you can see it clearly is behind the scenes, where Kubernetes and cloud-native computing programs power generative AI applications from OpenAI, NVIDIA, and Hugging Face.
During a press conference, Sharma gave six reasons why end-user companies such as Adobe, Bloomberg, and CERN use cloud-native technologies to deliver the flexibility and scale that AI workloads demand.
1. Optimized resource usage for AI workloads
Cloud-native applications are designed to use resources efficiently, which is crucial for AI. They ensure high performance while keeping costs in check. Tools like Kubeflow, now a CNCF incubating project, are instrumental in providing machine learning pipelines and MLOps, further enhancing this efficiency.
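To make that concrete, here is a minimal sketch, using the official Kubernetes Python client, of how an AI serving pod can declare GPU, CPU, and memory requests and limits so the scheduler can pack workloads efficiently. The pod name, container image, and resource figures are hypothetical, not taken from Sharma's talk.

```python
from kubernetes import client, config

# Assumes a reachable cluster and a local kubeconfig.
config.load_kube_config()

# Declare exactly what the workload needs: the scheduler uses the
# requests for bin-packing, and the limits cap runaway consumption.
pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="inference-worker"),  # hypothetical name
    spec=client.V1PodSpec(
        containers=[
            client.V1Container(
                name="model-server",
                image="ghcr.io/example/model-server:latest",  # hypothetical image
                resources=client.V1ResourceRequirements(
                    requests={"cpu": "2", "memory": "8Gi", "nvidia.com/gpu": "1"},
                    limits={"cpu": "4", "memory": "16Gi", "nvidia.com/gpu": "1"},
                ),
            )
        ],
        restart_policy="Never",
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```

Because requests and limits are declarative, the same specification runs on any conformant cluster, which is part of the efficiency story Sharma described.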
2. Reliable and flexible technology stack
The cloud-native stack offers the freedom to deploy AI workloads anywhere in the network. This flexibility, coupled with the ability to scale workloads using event-driven CNCF projects like KEDA, makes it a robust choice for AI applications.
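As a sketch of what that event-driven scaling looks like in practice, the snippet below creates a KEDA ScaledObject that scales the hypothetical "inference-worker" Deployment from the earlier example based on Kafka consumer lag. It assumes KEDA is installed in the cluster, and the topic, consumer group, and thresholds are illustrative.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig

# A KEDA ScaledObject: scale the Deployment up as prompt messages
# queue up, and all the way down to zero when the queue is empty.
scaled_object = {
    "apiVersion": "keda.sh/v1alpha1",
    "kind": "ScaledObject",
    "metadata": {"name": "inference-scaler", "namespace": "default"},
    "spec": {
        "scaleTargetRef": {"name": "inference-worker"},  # hypothetical Deployment
        "minReplicaCount": 0,
        "maxReplicaCount": 20,
        "triggers": [
            {
                "type": "kafka",
                "metadata": {
                    "bootstrapServers": "kafka:9092",  # illustrative broker
                    "consumerGroup": "inference",
                    "topic": "prompts",
                    "lagThreshold": "50",
                },
            }
        ],
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="keda.sh",
    version="v1alpha1",
    namespace="default",
    plural="scaledobjects",
    body=scaled_object,
)
```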
3. Agility in deployment and testing
Rapid deployment, testing, and iteration are key in AI development. Cloud-native practices support this agility, allowing for iterative testing and quick prototyping, which are often essential in AI projects.
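As one illustration of that iteration speed, a new model image can be rolled out with a single declarative patch; Kubernetes replaces pods gradually, so each experiment ships without downtime. The Deployment and image names are hypothetical carryovers from the earlier sketches.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig

# Point the Deployment at a new image; the default RollingUpdate
# strategy swaps pods in gradually, keeping the service available.
client.AppsV1Api().patch_namespaced_deployment(
    name="inference-worker",  # hypothetical Deployment
    namespace="default",
    body={
        "spec": {
            "template": {
                "spec": {
                    "containers": [
                        {
                            "name": "model-server",
                            "image": "ghcr.io/example/model-server:v2",
                        }
                    ]
                }
            }
        }
    },
)
```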
4. Transparency and open source values
The cloud-native and open-source communities prioritize transparency. This approach is vital for discovering, adapting, and delivering AI workflows in a safe, ethical, and open manner. This, I should add, is more of a hope than a reality.
5. Lowered barriers and minimized risks
Open interfaces and standards in cloud-native technologies provide a vast ecosystem of tools and runtimes. This diversity helps practitioners avoid being tied to a single provider, platform, or roadmap, thereby minimizing risks.
6. Rapid innovation
The cloud-native community is known for its quick adoption of new ideas and cutting-edge thinking. This leads to advanced ML/AI support and features within the cloud-native ecosystem, fostering continuous innovation.
So, while ordinary users trying to get ChatGPT to write their term papers or finally get that blasted spreadsheet to work will never recognize it, they, too, owe a debt to the cloud-native computing techniques behind generative AI apps.