[Docathon][Add CN Doc No.56-57] #6358

Merged
merged 6 commits into PaddlePaddle:develop from zade23:en_doc_5657 on Jan 17, 2024

Conversation

zade23
Contributor

@zade23 zade23 commented Dec 7, 2023

PR types

Others

PR changes

Docs

Description

Chinese documentation addition task

#6193

New Chinese docs added:

  • fused_bias_dropout_residual_layer_norm
  • FusedBiasDropoutResidualLayerNorm

English documentation links:

No. | API name | English documentation URL
56 | paddle.incubate.nn.functional.fused_bias_dropout_residual_layer_norm | https://www.paddlepaddle.org.cn/documentation/docs/en/develop/api/paddle/incubate/nn/functional/fused_bias_dropout_residual_layer_norm_en.html
57 | paddle.incubate.nn.FusedBiasDropoutResidualLayerNorm | https://www.paddlepaddle.org.cn/documentation/docs/en/develop/api/paddle/incubate/nn/FusedBiasDropoutResidualLayerNorm_en.html

@ZzSean @sunzhongkai588


paddle-bot bot commented Dec 7, 2023

Thank you for contributing to the PaddlePaddle documentation. The docs preview is being built; it will be available once the Docs-New job finishes. Preview link: http://preview-pr-6358.paddle-docs-preview.paddlepaddle.org.cn/documentation/docs/zh/api/index_cn.html
For more details about the preview tool, please see the PaddlePaddle docs preview tool guide.


@Vvsmile Vvsmile left a comment


Please address the points raised in the review. Thank you very much for your contribution!


.. py:function:: paddle.incubate.nn.functional.fused_bias_dropout_residual_layer_norm(x, residual, bias=None, ln_scale=None, ln_bias=None, dropout_rate=0.5, ln_epsilon=1e-05, training=True, mode='upscale_in_train', name=None)

The fused bias, dropout, and residual layer normalization operator. The pseudocode is as follows:

The description "fused bias, dropout, and residual layer normalization" does not need to be translated; use the original operator name directly, e.g. change it to "the fused_bias_dropout_residual_layer_norm operator."

If you want to add a functional description, it can take the form "the fused_bias_dropout_residual_layer_norm operator, which fuses bias addition, dropout, and residual layer normalization."

Contributor Author

Done
Before: the fused bias, dropout, and residual layer normalization operator.
After: the fused_bias_dropout_residual_layer_norm operator, which fuses bias addition, dropout, and residual layer normalization.

- **dropout_rate** (float, optional) - The dropout probability applied to the attention weights, used to drop some attention targets in the dropout after attention. 0 means no dropout. Default: 0.5.
- **ln_epsilon** (float, optional) - A small float added to the denominator of layer normalization to avoid division by zero. Default: 1e-5.
- **training** (bool, optional) - A flag indicating whether this is the training phase. Default: True.
- **mode** (str, optional) - ['upscale_in_train' (default) | 'downscale_in_infer'], there are two modes:

"there are two modes" -> "the two modes are, respectively:"

Contributor Author

Done
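For readers following the thread, here is a minimal usage sketch of the functional API being documented. The shapes are illustrative only, and the fused ops under paddle.incubate generally require a CUDA build of Paddle, so treat this as a sketch under those assumptions rather than verified output:

```python
import paddle
from paddle.incubate.nn import functional as incubate_f

# Illustrative shapes: (batch, seq_len, embed_dim)
x = paddle.rand([2, 4, 128])
residual = paddle.rand([2, 4, 128])
bias = paddle.rand([128])

# Roughly: out = layer_norm(residual + dropout(bias + x))
out = incubate_f.fused_bias_dropout_residual_layer_norm(
    x, residual, bias=bias,
    dropout_rate=0.5, ln_epsilon=1e-5,
    training=True, mode='upscale_in_train')

print(out.shape)  # expected: [2, 4, 128], same shape and dtype as x
```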


.. py:class:: paddle.incubate.nn.FusedBiasDropoutResidualLayerNorm(embed_dim, dropout_rate=0.5, weight_attr=None, bias_attr=None, epsilon=1e-05, name=None)

Applies the fused bias, dropout, and residual layer normalization operation.

Use the original operator name; it does not need to be translated. If you want to explain the operation, add that afterwards.
Suggested change:
"Applies the fused_bias_dropout_residual_layer_norm operator, which fuses bias addition, dropout, and residual layer normalization."

Contributor Author

Done
Before: applies the fused bias, dropout, and residual layer normalization operation.
After: applies the fused_bias_dropout_residual_layer_norm operator, which fuses bias addition, dropout, and residual layer normalization.

2. downscale_in_infer, downscale the output at inference time
- train: out = input * mask
- inference: out = input * (1.0 - p)
- **name** (str, optional) - Name of the operation (optional, default is None). For more information, please refer to :ref:`api_guide_Name`.

The ":ref:api_guide_Name" here is inconsistent with the ": ref:cn_api_paddle_ParamAttr" in the FusedBiasDropoutResidualLayerNorm Chinese doc (the previous addition); please unify the format.

Contributor Author

@Vvsmile Hello,
The format of these two references may actually be correct.
I found a Chinese doc, FusedMultiHeadAttention, that references both (Name and ParamAttr).
(screenshot)

If I have misunderstood your point, or if other changes are needed, please update the review.

@Vvsmile Vvsmile Jan 17, 2024

A minor detail: please unify whether the colon before ref is a full-width (Chinese) or half-width (English) colon. Apologies for the extra work caused by the unclear wording.

Contributor Author

> A minor detail: please unify whether the colon before ref is a full-width (Chinese) or half-width (English) colon.

Understood; the current revision consistently uses the half-width (English) colon.


Thanks for the effort.
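Stepping back from the formatting details: as background for the upscale_in_train / downscale_in_infer formulas quoted a few comments above, here is a small self-contained sketch of how the two dropout modes scale activations (standard dropout semantics; the exact behaviour of the fused kernel should be checked against the Paddle source):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10000)
p = 0.5                          # dropout_rate
mask = rng.random(10000) >= p    # keep mask (True = keep, False = drop)

# upscale_in_train (default): scale up during training, identity at inference
train_up, infer_up = x * mask / (1.0 - p), x

# downscale_in_infer: plain masking during training, scale down at inference
train_down, infer_down = x * mask, x * (1.0 - p)

# Either convention keeps the expected magnitude consistent between the two phases.
print(train_up.mean(), infer_up.mean())
print(train_down.mean(), infer_down.mean())
```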


COPY-FROM: paddle.incubate.nn.FusedBiasDropoutResidualLayerNorm

forward(src, src_mask=None, cache=None)

This does not match the English documentation; please check the source again.

It should be changed to forward(x, residual), which is also the signature used in the code. Please confirm this yourself as well.

Contributor Author

Done
Confirmed; the incorrect signature was probably the result of matching the wrong function during translation.
Changed to: forward(x, residual)
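For completeness, a corresponding hedged sketch of the layer class and the corrected forward(x, residual) signature (embed_dim and the tensor shapes are illustrative; like the functional op, this incubate layer typically requires a CUDA build of Paddle):

```python
import paddle
from paddle.incubate.nn import FusedBiasDropoutResidualLayerNorm

embed_dim = 128
layer = FusedBiasDropoutResidualLayerNorm(embed_dim, dropout_rate=0.5, epsilon=1e-5)

x = paddle.rand([2, 4, embed_dim])         # (batch, seq_len, embed_dim)
residual = paddle.rand([2, 4, embed_dim])  # same shape as x

out = layer(x, residual)   # forward(x, residual), as fixed in this review
print(out.shape)           # [2, 4, 128]; a Tensor with the same dtype and shape as x
print(layer.extra_repr())  # extra representation of the layer
```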


forward(src, src_mask=None, cache=None)
::::::::::::
Applies the fused bias, dropout, and residual layer normalization operation.

Use the original operator name; it does not need to be translated. If you want to explain the operation, add that afterwards.
Suggested change:
"Applies the fused_bias_dropout_residual_layer_norm operator, which fuses bias addition, dropout, and residual layer normalization."

Contributor Author

Done
Before: applies the fused bias, dropout, and residual layer normalization operation.
After: applies the fused_bias_dropout_residual_layer_norm operator, which fuses bias addition, dropout, and residual layer normalization.


Returns
::::::::::::
Tensor|tuple: the same data type as ``x``.

According to the English documentation, this should be translated as "a Tensor with the same data type and shape as x".

Contributor Author

Done


extra_repr()
::::::::::::
Returns extra information about the current layer.

According to the English documentation, this should be translated as "the extra representation of the current layer, which users can customize in their own layer implementation" (feel free to come up with a better translation).

Contributor Author

Done
Before: returns extra information about the current layer.
After: the extra representation of the current layer; you can customize the implementation in your own layer.

@zade23 zade23 requested a review from Vvsmile January 17, 2024 06:27

Vvsmile commented Jan 17, 2024

LGTM

zade23 and others added 2 commits January 17, 2024 15:14
…dual_layer_norm_cn.rst

Co-authored-by: zachary sun <70642955+sunzhongkai588@users.noreply.github.com>
…dual_layer_norm_cn.rst

Co-authored-by: zachary sun <70642955+sunzhongkai588@users.noreply.github.com>
Collaborator

@sunzhongkai588 sunzhongkai588 left a comment


Looks like you need to run pre-commit again.

Contributor Author

zade23 commented Jan 17, 2024

> Looks like you need to run pre-commit again.

Done
(screenshot)

Collaborator

@sunzhongkai588 sunzhongkai588 left a comment


LGTM

@luotao1 luotao1 merged commit e587b96 into PaddlePaddle:develop Jan 17, 2024
2 checks passed
@zade23 zade23 deleted the en_doc_5657 branch January 17, 2024 08:52