English-Chinese Dictionary (51ZiDian.com)

unsay
vt. to retract, to withdraw (one's words)

unsay
v 1: take back what one has said; "He swallowed his words" [synonym:
{swallow}, {take back}, {unsay}, {withdraw}]

Unsay \Un*say\, v. t. [1st pref. un- say.]
To recant or recall, as what has been said; to retract; to
take back again; to make as if not said.
[1913 Webster]

You can say and unsay things at pleasure. --Goldsmith.
[1913 Webster]



Related materials:


  • Supervised Fine-Tuning Achieve Rapid Task Adaption Via Alternating . . .
    Thus, in this paper, we employ a gradient-based method to dissect how the SFT process adapts LLMs to downstream tasks from the perspective of attention patterns.
  • Supervised Fine-Tuning Achieve Rapid Task Adaption Via Alternating . . .
    To address this issue, we propose to analyze the prerequisites and mechanism of such rapid task adaption during SFT from the perspective of the activation patterns of attention heads, using a gradient-based method.
  • Supervised Fine-Tuning: An Activation Pattern Optimization Process for . . .
    To address this issue, we propose to analyze how SFT adapts LLMs to different tasks through the lens of the activation patterns of attention heads, using a gradient-based method.
  • arXiv:2409.15820v2 [cs.LG] 18 Oct 2024
    In this section, we analyze changes in attention head activation patterns before and after SFT across various tasks and explore the relationships between these changes.
  • HIT SCIR: 29 long papers accepted to the ACL 2025 main conference and Findings
    In this work, we propose CC-Tuning, a novel multilingual fine-tuning paradigm that explicitly establishes a cross-lingual connection mechanism at the latent-space level. During training, CC-Tuning fuses the feed-forward activations from English and non-English inputs, enabling the model to benefit from both linguistic resources.
  • Supervised Fine-Tuning Achieve Rapid Task Adaption Via Alternating . . .
    Table 1: Statistics on the distribution of activation patterns for different LLMs. Experiments were conducted on tasks such as CodeSearchNet, GSM8k, MATH, SGSM, ARC, HellaSwag, and Winogrande.
  • Supervised Fine-Tuning: An Activation Pattern Optimization . . .
    This paper investigates how to improve the efficiency and effectiveness of LLMs on complex tasks, and how to cope with data scarcity. Using a gradient-based method, it studies how SFT adapts LLMs to downstream tasks from the perspective of attention patterns, finding that LLMs selectively activate specific attention heads during SFT, that the activation patterns of complex tasks are combinations of basic-task patterns, and that changes to a small number of parameters can significantly alter post-SFT activation patterns. The paper points to possible causes of LLMs' rapid learning and generalization, and proposes practical ways to improve the efficiency and effectiveness of SFT, particularly for complex tasks and data-scarce settings. The experimental design is sound, multiple datasets are used, and the code is open source. (A minimal sketch of such gradient-based head scoring appears after this list.)
  • Kai Xiong - Homepage
    I’m Kai Xiong, a Ph.D. student in the Research Center for Social Computing and Interactive Robotics (SCIR) at Harbin Institute of Technology (HIT, China). I am co-advised by Prof. Ting Liu and Prof. Xiao Ding. My research interests lie in event reasoning, eventic graphs, and large language models.
  • Supervised Fine-Tuning: An Activation Pattern Optimization Process for . . .
    The paper introduces a supervised fine-tuning method that optimizes the activation patterns of attention heads in large language models. By guiding the models to focus on the most relevant parts of the input, this approach can enhance the performance and interpretability of LLMs on specific tasks.
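
Several of the snippets above describe the same core idea: using a gradient-based method to measure which attention heads a task activates, then comparing these activation patterns before and after SFT. The Python sketch below is a minimal, hypothetical illustration of one common way to compute such per-head scores. The TinySelfAttention module and the first-order (activation × gradient) importance score, in the spirit of Michel et al. (2019), are assumptions for illustration, not the cited papers' exact procedure.

# Minimal sketch (assumption): a toy single-layer self-attention and a
# first-order importance score, |sum(activation * gradient)| per head,
# after Michel et al. (2019) -- not the cited papers' exact method.
import torch
import torch.nn as nn

class TinySelfAttention(nn.Module):
    """Multi-head self-attention that keeps per-head outputs for scoring."""

    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):  # x: (batch, seq, d_model)
        b, t, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape each to (batch, heads, seq, d_head)
        q, k, v = (z.reshape(b, t, self.n_heads, self.d_head).transpose(1, 2)
                   for z in (q, k, v))
        att = torch.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, -1)
        head_out = att @ v            # per-head outputs: (b, heads, t, d_head)
        head_out.retain_grad()        # so .grad is populated on backward
        self._head_out = head_out
        return self.out(head_out.transpose(1, 2).reshape(b, t, -1))

def head_activation_scores(model, x, target):
    """One score per head: |sum over batch/seq/features of act * grad|."""
    model.zero_grad()
    loss = nn.functional.mse_loss(model(x), target)
    loss.backward()
    h = model._head_out
    return (h * h.grad).sum(dim=(0, 2, 3)).abs()

torch.manual_seed(0)
model = TinySelfAttention()
x, y = torch.randn(2, 8, 64), torch.randn(2, 8, 64)
print(head_activation_scores(model, x, y))  # tensor of 4 head scores

Comparing this score vector before and after fine-tuning on a given task would give a crude, per-head picture of the "activation pattern" shift the snippets refer to.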





Chinese Dictionary - English Dictionary  2005-2009