DDN®, the global leader in artificial intelligence (AI) and multi-cloud data management solutions, today announced DDN Infinia – a next-generation software-defined storage platform that draws on two decades of DDN engineering in file systems, data orchestration and AI-based optimization to usher in the era of accelerated computing and generative AI.
DDN Infinia combines multi-tenancy at scale, containerization and the speed and efficiency customers have come to expect from DDN systems with simplified management and strong security. The new architecture accelerates and simplifies workflows for the data management demands of today and tomorrow, from generative AI and large language models (LLMs) to complex workflow movement across the edge, the data center and the cloud.
Burgeoning AI Data Challenges
Enterprises must manage diverse data sources across multiple sites – on-premises, at the edge and in multiple clouds – along with different systems for different applications and shared access to common data sources. Data created at the edge must move between the edge and the data center to become actionable, and the growing volume of distributed unstructured data requires metadata tagging so it can move swiftly, efficiently and securely.
Additional challenges, including rising electricity costs and scarce data center real estate, mean enterprises must fully utilize every resource, with storage operating as efficiently as possible. DDN Infinia removes hardware dependencies and delivers built-in core features to meet the requirements of secure, enterprise-wide AI data management.