how to use my own loss to compute grad? #3917
Comments
The metric is only for reporting. You need to write a loss layer.
mx.sym.MakeLoss
Hi guys! I'm also wondering how to write a loss layer. I've seen example/numpy-ops/custom_softmax.py, but I don't understand where exactly the loss function is applied. It looks like it just applies softmax, but after that we need a loss function (e.g. log-loss). Where is it defined? I'd like to change it.
Yes, eval_metric is only used to report the model's training accuracy; it doesn't influence training. I also don't know how to define a loss layer, or what loss function the Softmax layer uses.
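On the question of where the log-loss lives in custom_softmax.py: it never appears explicitly because, when softmax is followed by cross-entropy, the gradient with respect to the logits collapses to softmax(x) minus the one-hot label, which is exactly what that example's backward pass computes. A quick NumPy check of this identity (the numbers are illustrative, not from the example):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1]])
label = 0
p = softmax(logits)

# analytic gradient of cross-entropy(softmax(logits), label): p - one_hot
grad = p.copy()
grad[0, label] -= 1.0

# numeric gradient of -log(softmax(logits)[label]) by central differences
eps = 1e-6
num = np.zeros_like(logits)
for j in range(logits.shape[1]):
    lp, lm = logits.copy(), logits.copy()
    lp[0, j] += eps
    lm[0, j] -= eps
    num[0, j] = (-np.log(softmax(lp)[0, label])
                 + np.log(softmax(lm)[0, label])) / (2 * eps)

assert np.allclose(grad, num, atol=1e-5)
```

So to change the loss, you change the backward computation: the forward softmax alone does not determine which loss is being minimized.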
@piiswrong How do I write a loss layer? I can't find an example of one.
This issue is closed due to lack of activity in the last 90 days. Feel free to ping me to reopen if this is still an active issue. Thanks!
Now I want to use my own loss class, MyLoss, but where do I use it? Like below?
model.fit(X=X_train, eval_metric=[MyLoss],
          batch_end_callback=mx.callback.Speedometer(batch_size, 50))