Repository: platform/upstream/caffeonacl
Branches: accepted/tizen_5.0_unified, accepted/tizen_5.5_unified, accepted/tizen_5.5_unified_mobile_hotfix, accepted/tizen_5.5_unified_wearable_hotfix, accepted/tizen_unified, armcl-v18.11, master, sandbox/daeinki/armcl-v18.08, sandbox/nmerinov/llvm, tizen, tizen_5.0, tizen_5.5, tizen_5.5_mobile_hotfix, tizen_5.5_tv, tizen_5.5_wearable_hotfix
Domain: Machine Learning / ML Framework; License: BSD-2-Clause
Maintainer: Inki Dae <inki.dae@samsung.com>
path: root/src/caffe/blob.cpp
Age         Commit message                                                            Author           Files  Lines
2017-01-06  Switched multi-GPU to NCCL                                                Cyprien Noel     1      -0/+18
2016-05-05  Allow reshaping blobs to size 0.                                          Eric Tzeng       1      -1/+3
2015-09-18  Blob: add SyncedMemory shape accessor for GPU shape access                Jeff Donahue     1      -0/+11
2015-08-07  add double_data, double_diff to BlobProto for weights/snapshots saved     Jeff Donahue     1      -7/+42
2015-05-06  check that count_ does not overflow in Blob::Reshape                      Jonathan L Long  1      -0/+1
2015-03-03  Add option not to reshape to Blob::FromProto; use when loading Blobs      Jeff Donahue     1      -16/+20
2015-03-03  Add BlobShape message; use for Net input shapes                           Jeff Donahue     1      -7/+18
2015-03-03  Blobs are ND arrays (for N not necessarily equals 4).                     Jeff Donahue     1      -18/+74
2015-02-13  Blob: add scale_{data,diff} methods and tests                             Jeff Donahue     1      -0/+66
2015-01-29  Blob: add sumsq_{data,diff} methods                                       Jeff Donahue     1      -0/+74
2014-09-18  don't reallocate blobs when shrinking memory use                          Jonathan L Long  1      -7/+7
2014-08-06  LICENSE governs the whole project so strip file headers                   Evan Shelhamer   1      -2/+0
2014-07-26  Print blob L1 norms during forward/backward passes and updates if         Jeff Donahue     1      -0/+70
2014-07-17  collect CUDA includes and calls, separate from CPU-only mode, leave out   Evan Shelhamer   1      -3/+4
2014-07-03  fix casts (static for void*)                                              Evan Shelhamer   1      -12/+12
2014-07-03  replace all memcpy by caffe_copy                                          Evan Shelhamer   1      -8/+8
2014-07-03  switch to unified virtual addressing CUDA memcpy                          Evan Shelhamer   1      -2/+2
2014-05-24  make a Blob<unsigned int> and use in dropout layer                        Jeff Donahue     1      -7/+6
2014-05-24  use a Blob<int> instead of a SyncedMemory to store max_idx_               Jeff Donahue     1      -0/+9
2014-05-02  add set_cpu_data to Blob and SyncedMemory                                 Jonathan L Long  1      -0/+6
2014-04-25  move analytic gradient computation outside loop and store -- saves a lot  Jeff Donahue     1      -0/+5
2014-04-12  change Adopt -> Share as suggested by kloudkl                             Jeff Donahue     1      -2/+2
2014-04-11  add Adopt{Data,Diff} methods to blobs to enable "virtual copying"         Jeff Donahue     1      -0/+12
2014-03-27  Standardize copyright, add root-level CONTRIBUTORS credit                 Evan Shelhamer   1      -1/+1
2013-09-30  solver                                                                    Yangqing Jia     1      -2/+18
2013-09-27  blob cpp bugfix                                                           Yangqing Jia     1      -5/+2
2013-09-27  updated a bunch of things, ready to test if it breaks things              Yangqing Jia     1      -2/+40
2013-09-27  blob                                                                      Yangqing Jia     1      -1/+1
2013-09-27  cpplint                                                                   Yangqing Jia     1      -2/+2
2013-09-26  net                                                                       Yangqing Jia     1      -0/+1
2013-09-26  a lot of modifications - disallow copy constructors and misc              Yangqing Jia     1      -37/+11
2013-09-25  bugfix                                                                    Yangqing Jia     1      -0/+1
2013-09-25  softmax layer, test to be written                                         Yangqing Jia     1      -6/+20
2013-09-24  proto update                                                              Yangqing Jia     1      -4/+6
2013-09-23  pylint and code cleaning                                                  Yangqing Jia     1      -7/+10
2013-09-23  copyright message                                                         Yangqing Jia     1      -0/+2
2013-09-22  naming. I might regret it someday.                                        Yangqing Jia     1      -0/+135