A GCN-MLP Hybrid Model for Delay-Optimization-Sensitive Cell Prediction
基于GCN-MLP混合模型的延迟优化敏感单元预测方法

CHENG Zexiang (成泽祥), FENG Chaochao (冯超超), ZHAO Zhenyu (赵振宇), LUO Yuansheng (罗元盛)

Integrated Circuits and Embedded Systems (集成电路与嵌入式系统), DOI: 10.20193/j.ices2097-4191.2025.0092


Abstract

With the continued scaling of transistor technology nodes, timing closure in nanoscale integrated circuits faces severe challenges. Traditional circuit simulation can evaluate the performance of cell netlists and layouts, but its computationally intensive nature results in prohibitively high time costs. This paper proposes a delay-optimization-sensitive cell prediction model that combines a Graph Convolutional Network (GCN) with a Multilayer Perceptron (MLP). The approach first adjusts transistor sizes in the netlist dynamically according to input signal states; it then uses the GCN to parse the cell netlist structure and produce a homogeneous graph representation of transistor connectivity and process parameters; finally, these topological features are fused with conventional timing features and fed into the MLP, which predicts each cell's optimization potential and thereby identifies delay-optimization-sensitive cells. Experimental results show a prediction accuracy of 83.2% for the top 10 delay-optimization-sensitive cells with the highest optimization potential, and 75.3% for the top 5. Compared with SPICE simulation, the time required to locate delay-optimization-sensitive cells drops from hours to minutes, a speedup of roughly 600×. The method accurately identifies critical optimization targets, provides layout designers with transistor-level optimization parameters, and significantly improves timing-closure efficiency.
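The pipeline described in the abstract has three stages: represent each cell netlist as a homogeneous graph whose nodes are transistors (carrying size and process parameters) and whose edges follow net connectivity, embed that graph with a GCN, and fuse the pooled graph embedding with conventional timing features in an MLP that scores the cell's delay-optimization potential. The minimal PyTorch sketch below illustrates that data flow only; the class names, feature dimensions, and hand-built adjacency matrix (GCNMLPPredictor, node_dim, timing_dim, and so on) are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch only: node features, dimensions, and pooling choice
# are assumptions for illustration, not the model described in the paper.
import torch
import torch.nn as nn


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    # Symmetric normalization of A + I (Kipf & Welling-style GCN propagation).
    a_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = torch.diag(a_hat.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt


class GCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        # Propagate per-transistor features over the netlist graph: ReLU(A_hat X W).
        return torch.relu(a_norm @ self.linear(x))


class GCNMLPPredictor(nn.Module):
    # Fuses a GCN embedding of the cell netlist with cell-level timing features.
    def __init__(self, node_dim: int, timing_dim: int, hidden: int = 64):
        super().__init__()
        self.gcn1 = GCNLayer(node_dim, hidden)
        self.gcn2 = GCNLayer(hidden, hidden)
        self.mlp = nn.Sequential(
            nn.Linear(hidden + timing_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # predicted delay-optimization potential
        )

    def forward(self, node_feats, adj, timing_feats):
        a_norm = normalized_adjacency(adj)
        h = self.gcn2(self.gcn1(node_feats, a_norm), a_norm)
        graph_emb = h.mean(dim=0)                     # pool transistors -> one cell embedding
        fused = torch.cat([graph_emb, timing_feats])  # topology features + timing features
        return self.mlp(fused)


# Toy usage: a cell with 4 transistors, 6 per-transistor features
# (e.g. W/L, threshold flavor, input-state flag) and 5 cell-level timing features.
adj = torch.tensor([[0., 1., 1., 0.],
                    [1., 0., 0., 1.],
                    [1., 0., 0., 1.],
                    [0., 1., 1., 0.]])
node_feats = torch.rand(4, 6)
timing_feats = torch.rand(5)
model = GCNMLPPredictor(node_dim=6, timing_dim=5)
print(model(node_feats, adj, timing_feats))  # scalar optimization-potential score

In a setup like this, ranking cells by the predicted score would replace per-cell SPICE sweeps when shortlisting the top-k delay-optimization-sensitive cells, which is where the hours-to-minutes reduction reported in the abstract comes from.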


Key words

machine learning / delay-optimization-sensitive cell / transistor-level timing optimization / circuit simulation / circuit topology

Cite this article

CHENG Zexiang, FENG Chaochao, ZHAO Zhenyu, LUO Yuansheng. A GCN-MLP Hybrid Model for Delay-Optimization-Sensitive Cell Prediction[J]. Integrated Circuits and Embedded Systems. https://doi.org/10.20193/j.ices2097-4191.2025.0092
