SINGA-140: Fixed bug in CollectAll() function #141
raunaqabhyankar wants to merge 1 commit into apache:master
Conversation
Would you please change the commit message to follow this format: "SINGA-xxx "?

I'll change the commit message.

Here are the instructions: http://singa.apache.org/docs/general-rnn.html
Dear sir,

Have you tried to run the example?
Original Code (no changes): $ ./bin/singa-run.sh -conf examples/char-rnn/job.conf
Changed Code: $ ./bin/singa-run.sh -conf examples/char-rnn/job.conf
@nudles This is the output before and after the changes were made.
Hi, if you do not have a GPU (or CUDA), then comment out the GPU-related line in job.conf.
Hey, thanks for the tip! Changed code:
In SINGA_HOME/src/worker.cc, in the “int Worker::CollectAll(int step, NeuralNet* net)” function, layers that are unrolled (except the first one) should not collect parameters, because their parameters are shared with the first unrolled layer.
Previous:
if (layer->partition_id() == id_)
Current changes:
if (layer->partition_id() == id_ && layer->unroll_index() == 0)
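For context, a minimal sketch of where the changed condition sits inside CollectAll() is shown below; the surrounding loop structure, GetParams(), and Collect() are assumptions about the rest of worker.cc, not an exact copy of the source.

// Sketch only: the loop, GetParams() and Collect() are assumed, not copied
// verbatim from src/worker.cc.
int Worker::CollectAll(int step, NeuralNet* net) {
  for (auto* layer : net->layers()) {
    // With parameter sharing across the unrolled copies of an RNN layer, only
    // the copy with unroll_index() == 0 owns the Params; the other copies
    // would otherwise collect the same Params again.
    if (layer->partition_id() == id_ && layer->unroll_index() == 0) {
      for (Param* p : layer->GetParams())
        Collect(step, p);  // fetch the updated Param value for this worker
    }
  }
  return 1;
}

The partition check is kept intact; the change only skips parameter collection for every unrolled copy other than the first.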
@kaiping