neural network - How to write a custom MXNet layer with learned parameters


I'm following the documentation at http://mxnet.io/how_to/new_op.html on how to define a new neural network layer in MXNet in Python by subclassing the mx.operator.CustomOp class. The example loss layer has no learned parameters. How do learned parameters get into the forward and backward methods?

I figured it out. Learned parameters are handled like any other input to the op: they are declared in the list_arguments method of the CustomOpProp subclass, and MXNet then passes them to forward and backward through in_data alongside the regular inputs. From the docs page on writing custom symbols:

Note that list_arguments declares both the inputs and the parameters, and we recommend ordering them as ['input1', 'input2', ..., 'weight1', 'weight2', ...].
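
For illustration, here is a minimal sketch of a fully connected layer written this way, following the CustomOp/CustomOpProp pattern from that docs page. The operator name "dense_custom", the argument names "data" and "weight", and the num_hidden keyword are my own placeholder choices, not anything the question or the docs prescribe:

    import mxnet as mx

    class DenseCustom(mx.operator.CustomOp):
        def forward(self, is_train, req, in_data, out_data, aux):
            x = in_data[0]   # regular input
            w = in_data[1]   # learned parameter, delivered like any other input
            self.assign(out_data[0], req[0], mx.nd.dot(x, w))

        def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
            x = in_data[0]
            w = in_data[1]
            dy = out_grad[0]
            # gradient w.r.t. the input
            self.assign(in_grad[0], req[0], mx.nd.dot(dy, mx.nd.transpose(w)))
            # gradient w.r.t. the weight; the optimizer applies this update
            self.assign(in_grad[1], req[1], mx.nd.dot(mx.nd.transpose(x), dy))

    @mx.operator.register("dense_custom")
    class DenseCustomProp(mx.operator.CustomOpProp):
        def __init__(self, num_hidden):
            super(DenseCustomProp, self).__init__(need_top_grad=True)
            self.num_hidden = int(num_hidden)  # kwargs arrive as strings

        def list_arguments(self):
            # inputs first, then learned parameters, as the docs recommend
            return ['data', 'weight']

        def list_outputs(self):
            return ['output']

        def infer_shape(self, in_shape):
            data_shape = in_shape[0]                         # (batch, in_dim)
            weight_shape = (data_shape[1], self.num_hidden)  # derive weight shape
            output_shape = (data_shape[0], self.num_hidden)
            return [data_shape, weight_shape], [output_shape], []

        def create_operator(self, ctx, shapes, dtypes):
            return DenseCustom()

Because 'weight' is listed in list_arguments, it shows up among the symbol's arguments, gets initialized and updated like any built-in layer's parameter, and the gradient written to in_grad[1] in backward is what the optimizer uses. A usage sketch:

    data = mx.sym.Variable('data')
    net = mx.sym.Custom(data=data, num_hidden=128, name='fc1', op_type='dense_custom')
    print(net.list_arguments())  # ['data', 'fc1_weight']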

