neural network - How to write a custom MXNet layer with learned parameters
I'm following the documentation at http://mxnet.io/how_to/new_op.html on how to define a new neural network layer in MXNet in Python by subclassing the mx.operator.CustomOp class. The loss-layer example there has no learned parameters. How do learned parameters reach the forward and backward methods?
I figured it out. Learned parameters are configured as additional inputs to the op: you declare them in the list_arguments method of the accompanying CustomOpProp, and MXNet then treats them like any other argument of the symbol (they get initialized and updated by the optimizer) and passes them to forward and backward through in_data. From the docs page on writing custom symbols:

Note that list_arguments declares both inputs and parameters, and we recommend ordering them as
['input1', 'input2', ..., 'weight1', 'weight2', ...]