Abstract: In the cloud-edge-device collaborative architecture, data types are diverse and the storage and computing resources available at each tier differ, which poses new challenges for data management. Existing data models, and simple combinations of them, cannot meet the requirements of multimodal data management and collaborative management across the cloud, edge, and device tiers. Research on multimodal data modeling technology for cloud-edge-device collaboration has therefore become an important issue, the core of which is how to efficiently obtain query results that meet application needs from the three-tier cloud-edge-device architecture. Starting from the data types found at each of the three tiers, this study proposes a multimodal data modeling technique for cloud-edge-device collaboration, gives a tuple-based definition of the multimodal data model, and designs six base classes to solve the problem of uniformly representing multimodal data. A basic data operation architecture for cloud-edge-device collaborative query is also proposed to meet the query requirements of cloud-edge-device business scenarios. Integrity constraints for the multimodal data model are specified, laying a theoretical foundation for query optimization. Finally, a demonstration application of the multimodal data model for cloud-edge-device collaboration is presented, and the proposed storage method is evaluated in terms of data storage time, storage space, and query time. The experimental results show that the proposed scheme can effectively represent multimodal data in the cloud-edge-device collaborative architecture.
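To make the tuple-based unified representation more concrete, the following is a minimal Python sketch. It assumes a common base class whose instances can be flattened into tuples for cross-tier exchange; the six modality subclasses shown (relational, key-value, time-series, document, media, graph) are hypothetical placeholders chosen for illustration, since the abstract does not name the paper's actual base classes or their attributes.

```python
# Hypothetical sketch of a tuple-based multimodal data model.
# The six modality classes below are illustrative placeholders only;
# the paper's actual base classes are not named in the abstract.
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class MultimodalObject:
    """Common base: each object carries an id, its tier placement, and metadata."""
    obj_id: str
    tier: str                                  # "cloud", "edge", or "device" (assumed attribute)
    metadata: Dict[str, Any] = field(default_factory=dict)

    def as_tuple(self):
        # Unified tuple form assumed for cross-tier exchange and querying.
        return (self.obj_id, type(self).__name__, self.tier, self.metadata)

# Illustrative modality base classes (placeholders, not the paper's definitions).
@dataclass
class RelationalRecord(MultimodalObject):
    row: Dict[str, Any] = field(default_factory=dict)

@dataclass
class KeyValueItem(MultimodalObject):
    key: str = ""
    value: Any = None

@dataclass
class TimeSeriesPoint(MultimodalObject):
    timestamp: float = 0.0
    value: float = 0.0

@dataclass
class DocumentObject(MultimodalObject):
    body: Dict[str, Any] = field(default_factory=dict)

@dataclass
class MediaObject(MultimodalObject):
    uri: str = ""                               # reference to an image/video/audio blob

@dataclass
class GraphElement(MultimodalObject):
    nodes: list = field(default_factory=list)
    edges: list = field(default_factory=list)

if __name__ == "__main__":
    point = TimeSeriesPoint(obj_id="sensor-42", tier="device",
                            timestamp=1700000000.0, value=23.5)
    print(point.as_tuple())
```

In this sketch every modality reduces to the same tuple shape, which is one plausible way a unified representation could support collaborative queries across tiers; the actual class design and integrity constraints are defined in the body of the paper.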