BEIJING, July 17 (TiPost) — Huawei has opened a new battlefield in the sweeping artificial intelligence (AI) race as other tech peers ramp up development of large-scale models.
Credit: Visual China
Huawei launched two AI storage products, the OceanStor A310 and the FusionCube A3000. The former is designed to store deep learning data for foundation models and industry-specific models. It enables massive data management across the whole AI pipeline, from data collection and preprocessing to model training and inference. Its global file system (GFS) delivers cross-regional intelligent data weaving and streamlines data collection. It can also reduce invalid data transmission through near-data preprocessing based on near-memory computing, improving preprocessing efficiency by 30% by cutting data migration.
The FusionCube A3000 is an all-in-one machine that integrates training and inference, mainly serving large industry models and model applications at the tens-of-billions scale. It can be deployed within two hours, bundling high-performance storage nodes, training or inference nodes, switching devices, AI platform software, and management and operations software. With a high-performance container layer, the product allows customers to share GPUs across multiple training and inference tasks, raising resource utilization from 40% to more than 70%.
“In the era of large models, data determines how smart an AI can become, and as the carrier of data, data storage has become a key infrastructure for large AI models,” Zhou Yuefeng, vice president and head of Huawei's Intelligent Data and Storage Business Unit, said at the launch event.
The storage offerings can be seen as Huawei's one-two punch in response to the AI frenzy that is drawing more and more companies at home and abroad onto the bandwagon. Huawei unveiled Huawei Cloud Pangu Models 3.0, the latest version of its Pangu pre-trained deep learning AI model, just about a week ago.
Unlike mainstream large models such as ChatGPT, Pangu Models 3.0 does not compose poems; instead, it handles concrete tasks, guided by three goals for innovation: reshaping industries, honing technologies and sharing success, so as to keep building core competitiveness and better serve customers, industry partners and developers, said Zhang Pingan, executive director of Huawei and CEO of Huawei Cloud.
As an industry-oriented large model, Pangu Models 3.0 consists of three layers in a so-called “5+N+X” architecture. The foundation layer, L0, represents the “5”: it provides various skills for different industry scenarios through five foundational models covering natural language processing, computer vision, multimodal learning, prediction and scientific computing. The foundational models come in four parameter sizes, 10 billion, 38 billion, 71 billion and 100 billion, to meet customers' varied industry requirements for scenario, latency and response speed.
L1, the second layer, offers a variety of industry-specific models focusing on fields such as e-government, finance, manufacturing, mining and meteorology. Customers can also train exclusive large models with their own data on top of the L0 and L1 layers. The third layer, L2, provides scenario-specific models for particular industry applications or business scenarios, such as government service contact, screening for drug development, foreign object detection on conveyor belts and typhoon path prediction.