[MetaSchedule] [CUDA target] Did you forget to bind? #43
After finishing tuning, I apply the database like this:

```python
with args.target, db, tvm.transform.PassContext(opt_level=3):
    mod_deploy = relax.transform.MetaScheduleApplyDatabase(enable_warning=True)(mod)
```

It shows many warnings like:

```
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: matmul23
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d14_add24_add25
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: take
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d37_add34_add35_divide7
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d24_add10
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d7_add10
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_matmul28_add27_add28
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_matmul11_add11_strided_slice4
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_conv2d4_add10_add12
[17:27:57] /home/wyc/husen/sandbox/tvm/src/relax/transform/meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: fused_matmul9_add8_gelu
```

Then I use … I don't know why this happens.
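To see at a glance which primfuncs still lack tuning records, the warning lines can be scraped with a small helper. This is my own utility, not part of TVM; the log format is copied from the output above:

```python
import re


def untuned_primfuncs(log_text):
    """Extract primfunc names from MetaScheduleApplyDatabase warning lines."""
    pattern = r"Tuning record is not found for primfunc: (\S+)"
    return re.findall(pattern, log_text)


log = """\
[17:27:57] .../meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: matmul23
[17:27:57] .../meta_schedule.cc:162: Warning: Tuning record is not found for primfunc: take
"""
print(untuned_primfuncs(log))  # ['matmul23', 'take']
```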
You can use diffusers==0.15.0 and the problems may all be solved. ^^
Hi @Civitasv, thanks for the question! We used … I guess the mismatch you observed is because both the TIR extraction of …
We used the following call to `ms.relax_integration.tune_relax`:

```python
ms.relax_integration.tune_relax(
    mod=mod_deploy,
    target=tvm.target.Target("apple/m1-gpu-restricted"),  # for WebGPU 256-thread limitation
    params={},
    builder=ms.builder.LocalBuilder(
        max_workers=os.cpu_count(),
    ),
    runner=ms.runner.LocalRunner(timeout_sec=60),
    work_dir="log_db",
    max_trials_global=50000,
    max_trials_per_task=2000,
)
```
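As a rough mental model (my own simplification, not TVM's actual task scheduler), the two trial knobs interact like this: each extracted task can receive at most `max_trials_per_task` trials, and the total across all tasks cannot exceed `max_trials_global`. A task that ends up with zero trials gets no tuning record, which later triggers the warnings. A sketch, with a hypothetical task count:

```python
def allocate_trials(n_tasks, max_trials_global, max_trials_per_task):
    """Simplified budget model: grant each task up to max_trials_per_task
    trials, in order, until the global budget is exhausted."""
    remaining = max_trials_global
    allocation = []
    for _ in range(n_tasks):
        grant = min(max_trials_per_task, remaining)
        allocation.append(grant)
        remaining -= grant
    return allocation


# With the parameters above and, say, 20 extracted tasks (hypothetical count),
# every task gets its full 2000 trials, since 20 * 2000 <= 50000.
print(allocate_trials(20, 50000, 2000))
```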
Thanks for your reply. This is what I am running:

```python
def do_all_tune(mod, target):
    tunning_dir = "gpu3090_workdir"
    tunning_record = "gpu3090/database_tuning_record.json"
    tunning_workload = "gpu3090/database_workload.json"
    cooldown_interval = 0
    trial_cnt = 100
    trial_per = 2
    local_runner = ms.runner.LocalRunner(cooldown_sec=cooldown_interval, timeout_sec=60)
    database = ms.relax_integration.tune_relax(
        mod=mod,
        target=target,
        work_dir=tunning_dir,
        max_trials_global=trial_cnt,
        max_trials_per_task=trial_per,
        runner=local_runner,
        params={},
    )
    if os.path.exists(tunning_record):
        os.remove(tunning_record)
    if os.path.exists(tunning_workload):
        os.remove(tunning_workload)
    database.dump_pruned(
        ms.database.JSONDatabase(
            path_workload=tunning_workload,
            path_tuning_record=tunning_record,
        )
    )
```

It is still printing the warnings from #43 (comment). I wonder if it is relevant to the …
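As I understand it, `dump_pruned` writes only the useful records (e.g. the best-measured one per workload) into the target database rather than the full tuning history. A conceptual sketch of that kind of pruning in plain Python — my illustration of the idea, not TVM's implementation:

```python
def prune_records(records):
    """Keep only the best (lowest-latency) record per workload.

    `records` is a list of (workload_name, latency_seconds) pairs,
    standing in for the tuning records accumulated during search.
    """
    best = {}
    for workload, latency in records:
        if workload not in best or latency < best[workload]:
            best[workload] = latency
    return best


records = [("matmul23", 0.9), ("matmul23", 0.4), ("take", 1.2)]
print(prune_records(records))  # {'matmul23': 0.4, 'take': 1.2}
```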
Yes, it is relevant. With 10000 and 2000 as trial_cnt and trial_per, only the …
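To make the arithmetic concrete: with `max_trials_global=100` and `max_trials_per_task=2` as in the script above, at most 100 / 2 = 50 tasks can receive any trials at all, so any model with more extracted primfuncs than that is guaranteed to leave some of them without tuning records. A quick check (the task count of 60 is hypothetical):

```python
max_trials_global = 100
max_trials_per_task = 2
n_tasks = 60  # hypothetical number of extracted primfuncs

# Maximum number of tasks that can receive at least one full trial grant.
tasks_that_can_be_tuned = max_trials_global // max_trials_per_task
untuned = max(0, n_tasks - tasks_that_can_be_tuned)
print(tasks_that_can_be_tuned, untuned)  # 50 10
```

Each of those 10 untuned primfuncs would show up as a "Tuning record is not found" warning.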
Thanks @Civitasv! Glad that it works :-)
Currently, the parameters I am using are as follows:
Could you kindly share the parameters you are using to generate the log? I'm curious to know.