
Upgrade the attention-related code in PaddleMIX ppdiffusers #262

Closed
chenjjcccc opened this issue Oct 31, 2023 · 10 comments
Labels
HappyOpenSource Pro: Happy Open Source issues and PRs with more challenging tasks

Comments

chenjjcccc commented Oct 31, 2023

Upgrade the attention-related code in PaddleMIX ppdiffusers

Task description

Task background

  • Upgrade the code in the models module.
  • Scripts to upgrade:
    • models/attention.py
    • models/attention_processor.py

Completion steps

  1. Compare the code differences between ppdiffusers (v0.19.3) and diffusers (v0.21.1); a web diff tool can be used (see the local-diff sketch after this list).
  2. Fork the PaddleMix repository and update the code files listed in this task.
  3. Learn to use the pytest testing tool and make sure the upgraded code files pass the unit tests under the tests/models directory (if the upstream unit tests were also updated, upgrade the corresponding tests first, then run them) to confirm that the upgraded code is correct.
  4. Submit a PR to the develop branch of the main repository and describe how the task was completed (attach test information and inference examples).
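For step 1, the comparison can also be done locally instead of with a web diff tool. Below is a minimal sketch using Python's difflib, assuming both repositories are checked out side by side; the file paths are assumptions for illustration, not verified layouts.

# Local-diff sketch for step 1; the checkout paths below are assumptions.
import difflib
from pathlib import Path

old = Path("PaddleMIX/ppdiffusers/ppdiffusers/models/attention.py").read_text().splitlines()
new = Path("diffusers/src/diffusers/models/attention.py").read_text().splitlines()

# Unified diff, similar to what a web diff tool shows.
for line in difflib.unified_diff(
    old, new, fromfile="ppdiffusers v0.19.3", tofile="diffusers v0.23.1", lineterm=""
):
    print(line)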

Deliverables:

  1. The upgraded code files.
  2. Test results (see the test-run sketch after this list).
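To produce the test results for step 3, here is a minimal sketch of running the relevant unit tests, assuming it is executed from the directory of a local PaddleMix checkout that contains tests/models; it is equivalent to running pytest -v tests/models on the command line.

# Test-run sketch for step 3; run from the directory that contains tests/models.
import sys
import pytest

# Collect and run only the model-level tests affected by the attention upgrade;
# -v prints one line per test case, which is convenient to attach to the PR.
sys.exit(pytest.main(["-v", "tests/models"]))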

Task update:

  1. Compare the code differences between ppdiffusers (v0.19.3) and diffusers (v0.21.1); a web diff tool can be used.

Please now base the upgrade on the latest stable diffusers release, v0.23.1.

unicornshell commented Nov 21, 2023

The GEGLU module in ppdiffusers' attention.py is missing
def gelu(self, gate): if gate.device.type != "mps": return F.gelu(gate), but this does not show up in the code diff. Does it need to be changed?

@unicornshell

The unit tests for models/attention.py seem to be missing.

@LokeZhou (Collaborator)

The GEGLU module in ppdiffusers' attention.py is missing def gelu(self, gate): if gate.device.type != "mps": return F.gelu(gate), but this does not show up in the code diff. Does it need to be changed?

No, that is not needed.
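For context, here is a reconstruction of the torch helper quoted above (a sketch based on the snippet, not the Paddle code): the extra branch only works around gelu not being implemented for float16 on the torch "mps" (Apple Metal) backend, a device type with no Paddle counterpart, which is presumably why no change is needed on the ppdiffusers side.

# Sketch of the quoted torch diffusers GEGLU.gelu helper, reconstructed from
# the snippet above; shown only to illustrate why it is torch-specific.
import torch
import torch.nn.functional as F

def gelu(self, gate: torch.Tensor) -> torch.Tensor:  # method of diffusers' GEGLU
    if gate.device.type != "mps":
        return F.gelu(gate)
    # mps: gelu is not implemented for float16, so compute in float32 and cast back
    return F.gelu(gate.to(dtype=torch.float32)).to(dtype=gate.dtype)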

@LokeZhou (Collaborator)

The unit tests for models/attention.py seem to be missing.

If the torch diffusers repository has this unit test, add it; if it does not, there is no need to add it.

unicornshell commented Nov 22, 2023

Do the PyTorch 2.0 classes that exist in diffusers' attention_processor.py also need to be implemented in ppdiffusers? For example:
AttnAddedKVProcessor2_0, AttnProcessor2_0, LoRAAttnProcessor2_0

unicornshell commented Nov 22, 2023

Without the 2_0 classes, parts of the newly added definitions

ADDED_KV_ATTENTION_PROCESSORS = (
    AttnAddedKVProcessor,
    SlicedAttnAddedKVProcessor,
    AttnAddedKVProcessor2_0,
    XFormersAttnAddedKVProcessor,
    LoRAAttnAddedKVProcessor,
)

CROSS_ATTENTION_PROCESSORS = (
    AttnProcessor,
    AttnProcessor2_0,
    XFormersAttnProcessor,
    SlicedAttnProcessor,
    LoRAAttnProcessor,
    LoRAAttnProcessor2_0,
    LoRAXFormersAttnProcessor,
)

AttentionProcessor = Union[
    AttnProcessor,
    AttnProcessor2_0,
    XFormersAttnProcessor,
    SlicedAttnProcessor,
    AttnAddedKVProcessor,
    SlicedAttnAddedKVProcessor,
    AttnAddedKVProcessor2_0,
    XFormersAttnAddedKVProcessor,
    CustomDiffusionAttnProcessor,
    CustomDiffusionXFormersAttnProcessor,
    # deprecated
    LoRAAttnProcessor,
    LoRAAttnProcessor2_0,
    LoRAXFormersAttnProcessor,
    LoRAAttnAddedKVProcessor,
]

raise warnings such as "AttnProcessor2_0" is not defined.

@shiyutang (Collaborator)

@LokeZhou Will PR #322 be merged soon?

@LokeZhou (Collaborator)

@LokeZhou Will PR #322 be merged soon?

It will be merged once CI passes.

@LokeZhou (Collaborator)

Do the PyTorch 2.0 classes that exist in diffusers' attention_processor.py also need to be implemented in ppdiffusers? For example: AttnAddedKVProcessor2_0, AttnProcessor2_0, LoRAAttnProcessor2_0

If the functionality is the same, they are not needed; if the 2.0 version introduced new functionality, then they are.
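When the functionality is indeed identical, one lightweight option for keeping the ADDED_KV_ATTENTION_PROCESSORS / CROSS_ATTENTION_PROCESSORS tuples and the AttentionProcessor union quoted above importable is to alias the 2_0 names to the existing processors. This is a hypothetical sketch, not necessarily what the merged PR does.

# Hypothetical sketch inside ppdiffusers/models/attention_processor.py, placed
# after the base processor classes are defined. The torch 2_0 variants mainly
# switch to torch's scaled_dot_product_attention; if the Paddle processors are
# functionally equivalent, plain aliases let the tuples and the union resolve
# without "AttnProcessor2_0" is not defined warnings.
AttnProcessor2_0 = AttnProcessor
AttnAddedKVProcessor2_0 = AttnAddedKVProcessor
LoRAAttnProcessor2_0 = LoRAAttnProcessor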

@shiyutang (Collaborator)

The task has been completed by @co63oc; closing the issue.
