path: root/src/caffe/layers/softmax_loss_layer.cpp
Age        | Commit message                                                                  | Author             | Files | Lines
-----------|---------------------------------------------------------------------------------|--------------------|-------|-------
2015-03-08 | SoftmaxLossLayer fix: canonicalize input axis                                   | Jeff Donahue       | 1     | -1/+2
2015-03-03 | SoftmaxLossLayer generalized like SoftmaxLayer                                  | Jeff Donahue       | 1     | -19/+23
2015-02-16 | LayerRegistry uses shared_ptr instead of raw pointers                           | Jonathan L Long    | 1     | -1/+1
2015-02-06 | Added GPU implementation of SoftmaxWithLossLayer.                               | Sagan Bolliger     | 1     | -0/+4
2015-02-05 | Layer type is a string                                                          | Jeff Donahue       | 1     | -3/+3
2015-01-29 | Merge pull request #1654 from longjon/softmax-missing-values                    | Evan Shelhamer     | 1     | -9/+35
2015-01-21 | SoftmaxWithLossLayer: use CreateLayer so that a CuDNNSoftmaxLayer               | Jeff Donahue       | 1     | -0/+4
2014-12-30 | clean up formatting in SoftmaxLossLayer                                         | Jonathan L Long    | 1     | -2/+1
2014-12-30 | add spatial normalization option to SoftmaxLossLayer                            | Jonathan L Long    | 1     | -2/+11
2014-12-30 | add missing value support to SoftmaxLossLayer                                   | Jonathan L Long    | 1     | -5/+23
2014-12-29 | remove SoftmaxLossLayer CPU_ONLY stubs, since there is no GPU version           | Jonathan L Long    | 1     | -5/+1
2014-11-23 | use DCHECK in SoftmaxLossLayer so that bounds checking has no perf penalty wi...| Jonathan L Long    | 1     | -2/+2
2014-11-22 | in SoftmaxLossLayer, check label >= 0 in addition to upper bound                | Jonathan L Long    | 1     | -0/+1
2014-10-13 | move registration code to corresponding cpp files.                              | Yangqing Jia       | 1     | -2/+1
2014-10-11 | Minor fixes. (1) cudnn.hpp uses CHECK_EQ internally, so it needs to include ... | Kevin James Matzen | 1     | -1/+3
2014-09-19 | fix types of (Layer)SetUp, Reshape, Forward, and Backward calls                 | Jonathan L Long    | 1     | -14/+14
2014-09-18 | split off Reshape for loss layers                                               | Jonathan L Long    | 1     | -0/+7
2014-08-19 | softmax and softmax loss layers work across channels                            | Jonathan L Long    | 1     | -6/+13
2014-08-13 | Generalize loss by allowing any top blob to be used as a loss in which          | Jeff Donahue       | 1     | -17/+11
2014-08-06 | LICENSE governs the whole project so strip file headers                         | Evan Shelhamer     | 1     | -2/+0
2014-08-04 | Fix header alphabetization lint errors.                                         | Jeff Donahue       | 1     | -1/+1
2014-07-18 | reapply namespace change                                                        | Yangqing Jia       | 1     | -4/+2
2014-07-17 | stub out GPU layer methods to crash loudly in CPU-only mode                     | Evan Shelhamer     | 1     | -0/+4
2014-07-03 | replace all memcpy by caffe_copy                                                | Evan Shelhamer     | 1     | -1/+1
2014-06-26 | change Backward interface: propagate_down is a vector -- use to fix             | Jeff Donahue       | 1     | -12/+17
2014-06-25 | fix SOFTMAX_LOSS to work with loss top blob interface                           | Evan Shelhamer     | 1     | -1/+13
2014-06-19 | Now Loss layers would return the loss in the top blob if requested              | Sergio             | 1     | -0/+3
2014-06-08 | layers declare their names and number of input/output blobs, and don't          | Jeff Donahue       | 1     | -2/+1
2014-03-27 | Standardize copyright, add root-level CONTRIBUTORS credit                       | Evan Shelhamer     | 1     | -1/+1
2014-03-19 | revert unnecessary reordering of lines in softmaxwithlosslayer backward         | Jeff Donahue       | 1     | -1/+1
2014-03-19 | change specification of forward/backward function and fix layer                 | Jeff Donahue       | 1     | -8/+14
2014-03-18 | lint, except for rand/rand_r                                                    | Evan Shelhamer     | 1     | -1/+2
2014-02-26 | Splitting source files between CUDA and CPU code.                               | Eric Tzeng         | 1     | -0/+59