NETAPP Malaysia Sdn Bhd has announced new developments aimed at accelerating artificial intelligence (AI) innovation in Malaysia, in support of the country’s goal of becoming Southeast Asia’s AI hub.
The data infrastructure company, which provides unified data storage, integrated data services and cloud operations solutions to enterprise customers, said that by providing essential infrastructure for generative AI (GenAI), it aims to enable Malaysian businesses to harness transformative technologies and overcome data challenges.
According to country manager Alwyn David, NetApp’s latest innovations position it to help Malaysian businesses unlock the full value of their data with AI and achieve their strategic goals.
“GenAI powers practical business applications such as content generation, summarising extensive information and interactive Q&A sessions,” shared David. “Success in the AI era hinges on managing data that is governable, trusted and traceable.”
Technological research and consulting firm Gartner has forecast that AI software spending will reach US$297.9 bil (RM1.24 tril) by 2027, with GenAI accounting for over a third of that investment.
At the NetApp INSIGHT 2024 conference held recently, CEO George Kurian emphasised that AI challenges are fundamentally data challenges.
He outlined how intelligent data infrastructure can secure, govern and continually update relevant data to support a unified GenAI stack.
NetApp’s ONTAP storage system is undergoing NVIDIA certification with the NVIDIA DGX SuperPOD AI infrastructure, enabling organisations to manage large-scale AI projects effectively, particularly those involving large language models.
The company has created a global metadata namespace to allow secure exploration and management of data across hybrid multi-cloud environments, facilitating feature extraction and data classification for AI applications.
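To give a sense of what metadata-driven classification can look like in practice, here is a minimal, illustrative Python sketch that scans a directory tree and buckets files by basic filesystem metadata for downstream AI processing. The paths and categories are hypothetical examples for illustration only and do not represent NetApp’s namespace or APIs.

```python
# Illustrative sketch of metadata-driven data classification: scan a directory
# tree, record basic metadata per file, and bucket files for downstream AI use.
# Paths and category names are hypothetical, not NetApp APIs.
import os
from collections import defaultdict

TEXT_LIKE = {".txt", ".md", ".csv", ".json"}
MEDIA = {".png", ".jpg", ".mp4", ".wav"}

def classify(root: str) -> dict[str, list[dict]]:
    """Group files by coarse class using only filesystem metadata."""
    buckets: dict[str, list[dict]] = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            ext = os.path.splitext(name)[1].lower()
            record = {"path": path, "bytes": os.path.getsize(path)}
            if ext in TEXT_LIKE:
                buckets["text"].append(record)    # candidates for embedding/RAG
            elif ext in MEDIA:
                buckets["media"].append(record)   # candidates for vision/audio models
            else:
                buckets["other"].append(record)
    return buckets

if __name__ == "__main__":
    for label, files in classify(".").items():
        print(label, len(files), "files")
```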
Additionally, NetApp ONTAP now offers an integrated AI data pipeline that automates the preparation of unstructured data, enabling high-scale, low-latency semantic searches and retrieval augmented generation (RAG) inferencing.
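For readers unfamiliar with the pattern, the following is a minimal sketch of how a retrieval augmented generation (RAG) flow works in general: documents are embedded, the most relevant ones are retrieved for a query, and a grounded prompt is composed for a generative model. The toy embedding function and sample documents are stand-ins for illustration; a production pipeline would use a real embedding model and a vector index over the storage layer, not this code.

```python
# Minimal RAG sketch: embed documents, retrieve the closest matches for a
# query (the semantic search step), and build a grounded prompt.
import hashlib
import math

def embed(text: str, dims: int = 64) -> list[float]:
    """Toy embedding: hash each word into a fixed-size vector (illustrative only)."""
    vec = [0.0] * dims
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

documents = [
    "Quarterly sales grew 12% driven by cloud subscriptions.",
    "The data centre migration completed ahead of schedule.",
    "Employee onboarding now uses an automated checklist.",
]
index = [(doc, embed(doc)) for doc in documents]  # pre-computed document index

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

query = "How did sales perform last quarter?"
context = "\n".join(retrieve(query))
# The grounded prompt would then be sent to a generative model of choice.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```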
A new disaggregated storage architecture maximises network and flash speeds while reducing infrastructure costs, enhancing performance for compute-intensive AI workloads such as large language model training.
NetApp is also enhancing its native cloud services by providing an integrated data platform for data ingestion, discovery and cataloguing.
Prepared datasets can be securely shared and utilised with cloud providers’ AI and machine learning services, including third-party solutions.
Plans include integrating Google Cloud NetApp Volumes as a data store for BigQuery and Vertex AI, offering Malaysian businesses seamless cloud integration.
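Since the BigQuery and Vertex AI integration is still planned, the sketch below only illustrates the general idea: once a prepared dataset is reachable as a file (for example on a mounted NetApp Volumes share), it can be loaded into BigQuery for analysis or for use by Vertex AI. The mount path, project, dataset and table names are hypothetical placeholders, not part of any announced product.

```python
# Hedged sketch: load a prepared CSV from a mounted volume into BigQuery.
# All identifiers below are assumptions for illustration.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-project")        # assumed GCP project
table_id = "my-project.analytics.prepared_dataset"    # assumed destination table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,      # infer the schema from the file
    skip_leading_rows=1,  # skip the CSV header row
)

# Path on a volume mounted from the storage layer (hypothetical).
with open("/mnt/netapp_volumes/prepared/customers.csv", "rb") as source_file:
    load_job = client.load_table_from_file(source_file, table_id, job_config=job_config)

load_job.result()  # wait for the load job to complete
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```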
“Organisations across Malaysia are experimenting with GenAI to boost efficiency and drive innovation,” observed David. “By providing secure, scalable and high-performance data infrastructure that integrates with leading platforms, NetApp helps customers overcome barriers to implementing GenAI.”
The company has also announced various industry collaborations, including one with Domino Data Lab which has selected Amazon FSx for NetApp ONTAP as the storage foundation for its cloud platform, offering cost-effective performance and scalability.
Meanwhile, the NetApp AIPod with Lenovo ThinkSystem servers for NVIDIA OVX is now generally available, designed to help enterprises leverage generative AI capabilities to enhance productivity and unlock new revenue opportunities.
Furthermore, NetApp is introducing new features for its FlexPod AI solution, simplifying and securing AI applications for Malaysian businesses by combining Cisco compute and networking with NetApp storage. – Oct 2, 2024