path: root/include
Age | Commit message | Author | Files | Lines (-/+)
2018-01-31 | v0.5.0 | huifang | 19 | -421/+934
2017-08-26 | add support for ACL batch norm, direct conv, local connect, and concat layers | honggui | 7 | -51/+362
2017-06-02 | Porting Caffe onto ARM Compute Library | Yao Honggui | 17 | -3/+968
  The release version is 0.2.0.
2017-03-07 | Merge pull request #4630 from BlGene/load_hdf5_fix | Evan Shelhamer | 1 | -2/+2
  Made load_hd5 check blob dims by default, instead of reshaping.
2017-01-18 | Merge pull request #5098 from yaronli/master | Evan Shelhamer | 1 | -1/+4
  Check leveldb iterator status for snappy format.
2017-01-17 | Merge pull request #4563 from cypof/nccl | Evan Shelhamer | 15 | -233/+166
  Adopt NVIDIA's NCCL for multi-GPU and switch the interface to Python.
2017-01-06 | Using default from proto for prefetch | Cyprien Noel | 1 | -3/+0
2017-01-06 | Python layers should build on multiprocess & solver_cnt; enable with bindings | Marian Gläser | 1 | -1/+1
2017-01-06 | Switched multi-GPU to NCCL | Cyprien Noel | 15 | -231/+167
2016-12-21 | Use mkl_malloc when using MKL | Tomasz Socha | 1 | -0/+12
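A minimal sketch of the idea, assuming Intel MKL is installed; the `alloc_buffer` helper and the `USE_MKL` guard here are illustrative, not the exact Caffe code:

    #include <cstdlib>
    #ifdef USE_MKL
    #include <mkl.h>
    #endif

    // Allocate a float buffer, 64-byte aligned when built against MKL.
    float* alloc_buffer(size_t count) {
    #ifdef USE_MKL
      return static_cast<float*>(mkl_malloc(count * sizeof(float), 64));
    #else
      return static_cast<float*>(malloc(count * sizeof(float)));
    #endif
    }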
2016-12-16 | check leveldb iterator status for snappy format | liyangguang | 1 | -1/+4
2016-11-25 | Revert "solver: check and set type to reconcile class and proto" | Evan Shelhamer | 1 | -2/+0
  As pointed out by #5028, this does not achieve what it intended and furthermore causes trouble with direct solver instantiation. Reverts commit e52451de914312b80a83459cb160c2f72a5b4fea.
2016-11-21 | solver: check and set type to reconcile class and proto | Evan Shelhamer | 1 | -0/+2
  The solver checks its proto type (SolverParameter.type) on instantiation: if the proto type is unspecified, it is set according to the class type `Solver::type()`; if the proto type and class type conflict, the solver dies loudly. This helps avoid accidentally instantiating a different solver type than intended when the solver definition and class differ. Guaranteed type information in the SolverParameter will simplify multi-solver coordination too.
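A minimal sketch of the check described above; the `CheckType` helper is illustrative rather than the exact code added here, and assumes glog and the generated `SolverParameter` class:

    // Reconcile the proto-declared solver type with the instantiated class.
    void CheckType(caffe::SolverParameter* param, const std::string& class_type) {
      if (!param->has_type()) {
        param->set_type(class_type);  // unspecified: adopt the class type
      } else if (param->type() != class_type) {
        LOG(FATAL) << "Solver type mismatch: proto declares " << param->type()
                   << " but the class is " << class_type;  // die loudly
      }
    }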
2016-11-16 | sigmoid cross-entropy loss: normalize loss by different schemes | Evan Shelhamer | 1 | -0/+11
  Sig-ce loss handles all the same normalizations as the softmax loss; refer to #3296 for more detail. This preserves the default normalization for sig-ce loss: batch size.
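As a sketch of what this enables, assuming Caffe's generated protobuf API (the mode shown is one example of the shared schemes):

    // Select a normalization scheme for sig-ce loss, as for softmax loss.
    caffe::LayerParameter loss;
    loss.set_type("SigmoidCrossEntropyLoss");
    loss.mutable_loss_param()->set_normalization(
        caffe::LossParameter_NormalizationMode_VALID);  // count non-ignored targets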
2016-11-15 | sigmoid cross-entropy loss: ignore selected targets by `ignore_label` | Evan Shelhamer | 1 | -0/+5
  Sig-ce learns to ignore by zeroing out the loss/diff at targets equal to the configured `ignore_label`. N.b. as of now the loss/diff are not properly normalized when there are ignored targets; sig-ce loss should adopt the same normalization options as softmax loss.
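A sketch of configuring the option through the protobuf API; the label value 255 is an arbitrary example:

    // Targets equal to ignore_label contribute nothing to loss or diff.
    caffe::LayerParameter loss;
    loss.set_type("SigmoidCrossEntropyLoss");
    loss.mutable_loss_param()->set_ignore_label(255);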
2016-11-01 | corrected typo in accuracy_layer.hpp: MaxTopBlos -> MaxTopBlobs | baecchi | 1 | -1/+1
2016-10-27 | sigmoid cross-entropy loss: add GPU forward for full GPU mode | Evan Shelhamer | 1 | -0/+2
  Closes #3004.
2016-10-22 | Fix: made load_hd5 check blob dims by default | max argus | 1 | -2/+2
  Size checks are needed when loading parameters to avoid strange bugs; when loading data, we continue to reshape.
2016-09-12 | batch norm: auto-upgrade old layer definitions w/ param messages | Evan Shelhamer | 1 | -0/+6
  Automatically strip old batch norm layer definitions, including `param` messages. The batch norm layer used to require manually masking its state from the solver by setting `param { lr_mult: 0 }` messages for each of its statistics; this is now handled automatically by the layer.
2016-09-12 | batch norm: hide statistics from solver, simplifying layer definition | Evan Shelhamer | 1 | -4/+2
  Batch norm statistics are not learnable parameters subject to solver updates, so they must be shielded from the solver. The `BatchNorm` layer now masks its statistics for itself by zeroing parameter learning rates instead of relying on the layer definition. N.b. declaring `param`s for batch norm layers is no longer allowed.
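An illustrative sketch of the self-masking idea, written against Caffe's layer internals but not verbatim from this commit:

    // In BatchNorm's LayerSetUp: pin each statistic blob by zeroing its
    // learning-rate multiplier so the solver never updates it.
    for (size_t i = 0; i < this->blobs_.size(); ++i) {
      caffe::ParamSpec* spec = this->layer_param_.add_param();
      spec->set_lr_mult(0.f);  // statistics are not learnable
    }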
2016-09-12 | [docs] identify batch norm layer blobs | Evan Shelhamer | 1 | -11/+12
2016-09-09 | [docs] clarify handling of bias and scaling by BiasLayer, ScaleLayer | Evan Shelhamer | 3 | -15/+15
  A bias/scaling can be applied wherever desired by defining the respective layers, and `ScaleLayer` can handle both as a memory optimization.
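A sketch of that optimization, assuming the generated protobuf API (the axis value is the per-channel default):

    // One Scale layer learning both a scale and a bias, replacing a
    // separate BiasLayer.
    caffe::LayerParameter scale;
    scale.set_type("Scale");
    scale.mutable_scale_param()->set_axis(1);          // per-channel
    scale.mutable_scale_param()->set_bias_term(true);  // fold in the bias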
2016-08-29 | Merge pull request #4647 from ClimbsRocks/patch-3 | Jeff Donahue | 1 | -1/+1
  Changes "c++" to "C++" for consistency.
2016-08-29 | Merge pull request #4646 from ClimbsRocks/patch-2 | Jeff Donahue | 1 | -1/+1
  Fixes typo: duplicate "a a".
2016-08-28 | changes "c++" to "C++" for consistency | Preston Parry | 1 | -1/+1
2016-08-28 | fixes typo: duplicate "a a" | Preston Parry | 1 | -1/+1
2016-08-28 | updates tense in docs | Preston Parry | 1 | -1/+1
  "Could" seems to imply that something is blocking one from calling the registered layers; "can" states more directly that a user can choose to do this.
2016-08-18 | Merge pull request #3272 from ixartz/master | Evan Shelhamer | 1 | -0/+5
  [cmake] OSX 10.10 (and later) use the Accelerate framework instead of veclib.
2016-06-03 | Add level and stages to Net constructor | Luke Yeager | 1 | -0/+1
  This internal functionality will be exposed through the various interfaces in subsequent commits. Also adds C++ tests for all-in-one nets.
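A sketch of what the constructor accepts once exposed; the file name and stage name are examples:

    // Instantiate the TEST-phase view of an all-in-one net at level 0
    // with the "deploy" stage active.
    std::vector<std::string> stages = {"deploy"};
    caffe::Net<float> net("all_in_one.prototxt", caffe::TEST, 0, &stages);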
2016-06-01 | Add LSTMLayer and LSTMUnitLayer, with tests | Jeff Donahue | 1 | -0/+154
2016-06-01 | Add RNNLayer, with tests | Jeff Donahue | 1 | -0/+47
2016-06-01 | Add RecurrentLayer: an abstract superclass for other recurrent layer types | Jeff Donahue | 1 | -0/+187
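A sketch of configuring one of the new layers through `recurrent_param`; the output size and filler type are example values:

    // An LSTM over a T x N x ... input with a T x N continuation marker.
    caffe::LayerParameter lstm;
    lstm.set_type("LSTM");
    lstm.mutable_recurrent_param()->set_num_output(256);
    lstm.mutable_recurrent_param()->mutable_weight_filler()->set_type("uniform");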
2016-05-16 | Add cuDNN v5 support, drop cuDNN v3 support | Felix Abecassis | 4 | -3/+24
  cuDNN v4 is still supported.
2016-05-04 | add parameter layer for learning any bottom | Jonathan L Long | 1 | -0/+45
2016-05-04 | Merge pull request #3995 from ZhouYzzz/python-phase | Jon Long | 1 | -0/+1
  Allow the Python layer to have the attribute "phase".
2016-04-20 | Don't set map_size=1TB in util/db_lmdb | Luke Yeager | 1 | -5/+8
  Instead, double the map size on the MDB_MAP_FULL exception.
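A sketch of the grow-on-demand strategy, assuming the LMDB C API (error handling and the write replay are trimmed; not the exact util/db_lmdb code):

    // On a full map, double the mapsize, then retry the writes in a
    // fresh transaction (a failed commit frees the old one).
    int rc = mdb_txn_commit(txn);
    if (rc == MDB_MAP_FULL) {
      MDB_envinfo info;
      mdb_env_info(env, &info);
      mdb_env_set_mapsize(env, info.me_mapsize * 2);
      // ... begin a new transaction and replay the pending writes ...
    }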
2016-04-15 | Allow the Python layer to have the attribute "phase" | ZhouYzzz | 1 | -0/+1
2016-04-14 | CropLayer: groom comments | Evan Shelhamer | 1 | -0/+9
2016-03-05 | Merge pull request #3590 from junshi15/GPUUtilities | Jon Long | 1 | -0/+5
  Add functions to check and grab GPU.
2016-03-05 | Merge pull request #3588 from junshi15/P2psyncPrepare | Jon Long | 1 | -1/+4
  Refine P2PSync.
2016-03-05 | split p2psync::run() | Jun Shi | 1 | -1/+4
2016-03-05 | Crop: fixes, tests and negative axis indexing | max argus | 1 | -2/+2
2016-03-05 | Extend Crop to N-D, changed CropParameter | max argus | 1 | -2/+20
2016-03-05 | add CropLayer: crop blob to another blob's dimensions with offsets | Jonathan L Long | 1 | -0/+49
  Configure offset(s) through the proto definition.
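A sketch of the proto-configured offsets, assuming the generated API; the axis and offset values are examples:

    // Crop bottom[0] to bottom[1]'s shape from axis 2 onward, starting
    // 8 units in along each cropped axis.
    caffe::LayerParameter crop;
    crop.set_type("Crop");
    crop.mutable_crop_param()->set_axis(2);
    crop.mutable_crop_param()->add_offset(8);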
2016-03-04 | add check and find GPU device utilities | Jun Shi | 1 | -0/+5
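A sketch of a device-availability check in the spirit of these utilities, assuming the CUDA runtime API; the helper name is illustrative:

    #include <cuda_runtime.h>

    // Probe a device by selecting it and making a trivial allocation.
    bool DeviceUsable(int device_id) {
      if (cudaSetDevice(device_id) != cudaSuccess) return false;
      void* probe = nullptr;
      if (cudaMalloc(&probe, 1) != cudaSuccess) return false;
      cudaFree(probe);
      return true;
    }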
2016-02-26 | Deprecate ForwardPrefilled(), Forward(bottom, loss) in lieu of dropping | Evan Shelhamer | 1 | -0/+9
  Relax the removal of `Forward()` variations by deprecating them instead.
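A sketch of the preferred call after the deprecation (net setup omitted): inputs come from the net itself rather than from Forward's arguments.

    float loss;
    const std::vector<caffe::Blob<float>*>& outputs = net.Forward(&loss);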
2016-02-25 | collect Net inputs from Input layers | Evan Shelhamer | 1 | -2/+11
  Restore the list of net inputs for compatibility with the pycaffe and matcaffe interfaces and downstream C++.
2016-02-25 | drop Net inputs + Forward with bottoms | Evan Shelhamer | 1 | -27/+7
  Drop the special cases for `input` fields, the `Net` input members, and the `Net` interface for Forward with bottoms, along with the Forward() / ForwardPrefilled() distinction.
2016-02-25 | deprecate input fields and upgrade automagically | Evan Shelhamer | 1 | -0/+6
2016-02-25 | add InputLayer for Net input | Evan Shelhamer | 1 | -0/+44
  Create an input layer to replace oddball Net `input` fields.
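A sketch of the replacement, assuming the generated protobuf API; the blob name and shape are examples:

    // An Input layer standing in for the old `input`/`input_shape` fields.
    caffe::LayerParameter input;
    input.set_name("data");
    input.set_type("Input");
    input.add_top("data");
    caffe::BlobShape* shape = input.mutable_input_param()->add_shape();
    shape->add_dim(1); shape->add_dim(3);
    shape->add_dim(224); shape->add_dim(224);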