Over the past decade, artificial intelligence-related services and applications have emerged at scale, demanding high computing power, high bandwidth, and low latency. Edge computing is currently regarded as the most suitable computing paradigm for such applications, especially those involving video analysis. This study investigates the offloading of heterogeneous video analysis tasks in a multi-server, multi-user setting, where each user selects an appropriate edge server and uploads its raw video data to that server for analysis. To handle the competition for and sharing of limited network resources among many users effectively, and to reach a stable resource allocation in which no user has an incentive to change its task offloading decision unilaterally, the problem is modeled as a multiplayer game. With the optimization goal of minimizing the overall delay, the study considers the non-distributed and distributed video analysis scenarios in turn and proposes a game theory-based potential optimal server selection algorithm and a video unit allocation algorithm, respectively. Rigorous mathematical proofs show that the proposed algorithms reach a Nash equilibrium in both scenarios while guaranteeing a low overall delay. Finally, extensive experiments on real-world datasets show that the proposed methods reduce the overall delay by 26.3% on average compared with existing algorithms.
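To make the game-theoretic claim concrete, the sketch below shows how best-response dynamics reach a Nash equilibrium in a deliberately simplified server-selection game. The delay model (each user's delay equals a server-specific base delay times the number of users sharing that server) and all function names are illustrative assumptions, not the paper's actual formulation or algorithm; it only demonstrates the kind of stable allocation the abstract describes, where no user can lower its delay by unilaterally switching servers.

```python
# Hypothetical singleton congestion game: delay on server s with load n
# is base_delay[s] * n. Games of this form admit an exact potential
# function, so best-response dynamics converge to a Nash equilibrium.
def best_response_dynamics(num_users, base_delay, max_rounds=100):
    num_servers = len(base_delay)
    choice = [0] * num_users            # every user starts on server 0
    loads = [0] * num_servers
    loads[0] = num_users
    for _ in range(max_rounds):
        changed = False
        for u in range(num_users):
            cur = choice[u]
            best, best_delay = cur, base_delay[cur] * loads[cur]
            for s in range(num_servers):
                if s == cur:
                    continue
                # delay the user would see after moving to server s
                d = base_delay[s] * (loads[s] + 1)
                if d < best_delay:      # switch only on strict improvement
                    best, best_delay = s, d
            if best != cur:
                loads[cur] -= 1
                loads[best] += 1
                choice[u] = best
                changed = True
        if not changed:                 # no user wants to deviate: equilibrium
            break
    return choice, loads

choices, loads = best_response_dynamics(6, [1.0, 2.0, 3.0])
```

At the returned allocation, every user's current delay is no larger than the delay it would incur on any other server after moving there, which is exactly the unilateral-deviation stability the abstract refers to.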