History
version 0.17.25 [7-19-2019]
- NEW: add `only_output_to_file` arg to the `util.sys_output_tap` class. You can disable screen output by setting this flag to `True`, which makes the script run silently.
version 0.17.24 [4-3-2019]
- NEW: add Anti-996 License as auxiliary license
version 0.17.23 [3-5-2019]
- NEW: `BatchNorm`'s `mean` can now be set to `None` to disable mean subtraction
version 0.17.22 [2-28-2019]
- FIXED: `self.b` should be set to `None` if not specified for the `Conv2D` module
version 0.17.21 [2-13-2019]
- NEW: `BatchNorm`'s `inv_std` can now be set to `None` to disable variance scaling
version 0.17.20 [2-11-2019]
- FIXED: `self.b` should be set to `None` if not specified for the `Dense` module
version 0.17.19 [1-23-2019]
- NEW: add `clear_nan` argument for the `sgd`, `adam` and `adadelta` optimizers
- MODIFIED: add default value 1e-4 for `sgd`'s `learning_rate` arg
version 0.17.17 [11-22-2018]
- NEW: add `GroupNorm` module for group normalization implementation
- MODIFIED: expose `dim_broadcast` arg for the `Module.register_param()` method
- MODIFIED: replace `spec = tensor.patternbroadcast(spec, dim_broadcast)` with `spec = theano.shared(spec, broadcastable=dim_broadcast)` in `util.create_param()`, because the former would change the tensor's type
version 0.17.16 [11-19-2018]
- FIXED: wrong scale of `model_size` for `ext.visual.get_model_summary()`
- MODIFIED: add `stride` arg for `ShuffleUnit_Stack` and `ShuffleUnit_v2_Stack`; add `fusion_mode` arg for `ShuffleUnit_Stack`; improve their documentation
version 0.17.15 [11-16-2018]
- NEW: add `ext.visual` sub-module, containing model summarization & visualization toolkits
version 0.17.14 [11-15-2018]
- FIXED: remove redundant `bn5` layer of `ShuffleUnit_v2` when `stride` = 1 and `batchnorm_mode` = 2
version 0.17.13 [11-13-2018]
- NEW: add `model.Alternate_2D_LSTM` module for 2D LSTM implementation by alternating 1D LSTMs along different input dimensions
version 0.17.12 [11-6-2018]
- NEW: add `LSTM2D` module for 2D LSTM implementation
- NEW: add `.todevice()` interface to the `Module` class for possible support of model-parallel multi-GPU training. However, due to Theano issue 6655, this feature won't be finished, so use it at your own risk.
- MODIFIED: the `activation` param of the `Sequential` class now supports list input
- MODIFIED: merge pull requests #1 and #2; `functional.spatial_pyramid_pooling()` now supports 3 different implementations
version 0.17.11 [9-3-2018]
- MODIFIED: returned `bbox`'s shape is changed to (B, H, W, k, n) for `model_CTPN`
version 0.17.10 [8-16-2018]
- MODIFIED: add `border_mode` arg to `dandelion.ext.CV.imrotate()`; the `interpolation` arg type is changed to string
- MODIFIED: returned `bbox`'s shape is changed to (B, H, W, n*k) for `model_CTPN`
version 0.17.9 [8-7-2018]
- NEW: add `theano_safe_run`, `Finite_Memory_Array`, `pad_sequence_into_array`, `get_local_ip`, `get_time_stamp` into `dandelion.util`
- NEW: add documentation of `dandelion.util` (unfinished)
version 0.17.8 [8-3-2018]
- MODIFIED: disable the auto-broadcasting in `create_param()`. This auto-broadcasting would raise a Theano exception for a parameter with shape = [1].
- NEW: add `model.shufflenet.ShuffleUnit_v2_Stack` and `model.shufflenet.ShuffleNet_v2` for ShuffleNet_v2 reference implementation
- NEW: move `channel_shuffle()` from `model.shufflenet.py` into `functional.py`
version 0.17.7 [8-2-2018]
From this version the documentation officially supports LaTeX math.
- MODIFIED: move the arg `alpha` of `Module.Center` from the class declaration to its `.forward()` interface
version 0.17.6 [7-25-2018]
- MODIFIED: change default value of `Module.set_weights_by_name()`'s arg `unmatched` from `ignore` to `raise`
- MODIFIED: change default value of `model.vgg.model_VGG16()`'s arg `flip_filters` from `True` to `False`
version 0.17.5 [7-20-2018]
- FIXED: fixed typo in `objective.categorical_crossentropy()`
version 0.17.4 [7-20-2018]
- NEW: add class weighting support for `objective.categorical_crossentropy()` and `objective.categorical_crossentropy_log()`
- NEW: add `util.theano_safe_run()` to help catch memory exceptions when running Theano functions
version 0.17.3 [7-18-2018]
- FIXED: pooling mode in `model.shufflenet.ShuffleUnit` changed to `average_inc_pad` for correct gradient
version 0.17.2 [7-17-2018]
- NEW: add `model.shufflenet.model_ShuffleSeg` for Shuffle-Seg model reference implementation
version 0.17.1 [7-12-2018]
- MODIFIED: modify all `Test/Test_*.py` to be compatible with pytest
- NEW: add Travis CI for automatic unit testing
version 0.17.0 [7-12-2018]
In this version the Module's parameter and sub-module naming conventions are changed to ensure a unique name for each variable/module in a complex network.
It's incompatible with the previous version if your work accessed these names; otherwise there is no impact.
Note: to set weights saved by a previous Dandelion (version >= 0.14.0), use `.set_weights()` instead of `.set_weights_by_name()`. For weights saved by Dandelion versions < 0.14.0, the quick way is to set the model's sub-module weights explicitly, e.g. `model_new_dandelion.conv1.W.set_value(model_old_dandelion.conv1.W.get_value())`, as sketched below.
From this version on, it's recommended to let the framework auto-name the module parameters when you define your own module with `register_param()` and `register_self_updating_variable()`.
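A minimal migration sketch for the note above (the pickle container, file name, and `model` instance are illustrative assumptions; only the distinction between order-based `.set_weights()` and name-based `.set_weights_by_name()` comes from this changelog):

```python
import pickle

# Weights previously dumped from .get_weights() (file name and pickle format are hypothetical)
with open('weights_saved_by_old_dandelion.pkl', 'rb') as f:
    saved_weights = pickle.load(f)

# .set_weights() matches parameters by traversal order, so it is unaffected by the renaming
model.set_weights(saved_weights)

# For weights saved by Dandelion < 0.14.0, copy each sub-module weight explicitly, e.g.:
# model_new_dandelion.conv1.W.set_value(model_old_dandelion.conv1.W.get_value())
```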
- MODIFIED: a module's variable naming convention is changed to `variable_name@parent_module_name` to ensure a unique name for each variable in a complex network
- MODIFIED: a module's naming convention is changed to `class_name|instance_name@parent_module_name` to ensure a unique name for each module in a complex network
- MODIFIED: remove all specified names for `register_param()` and `register_self_updating_variable()`; leave the variables to be named automatically by their parent module
- MODIFIED: improve `model.shufflenet.ShuffleUnit`
- NEW: add `Sequential` container in `dandelion.module` for usage convenience
- NEW: add `model.shufflenet.ShuffleUnit_Stack` and `model.shufflenet.ShuffleNet` for ShuffleNet reference implementation
version 0.16.10 [7-10-2018]
- MODIFIED: disable all install requirements to prevent possible conflicts between pip and conda channels
version 0.16.9 [7-10-2018]
- MODIFIED: import all model reference implementations into `dandelion.model`'s namespace
- FIXED: `ConvTransposed2D`'s `W_shape` should use `in_channels` as the first dimension; incorrect `W_shape` when `num_groups` > 1
version 0.16.8 [7-9-2018]
- NEW: add `model.shufflenet.DSConv2D` for Depthwise Separable Convolution reference implementation
- NEW: add `model.shufflenet.ShuffleUnit` for ShuffleNet reference implementation
- FIXED: `W_shape` of `module.Conv2D` should account for `num_groups`
version 0.16.7 [7-6-2018]
- NEW: add `model.vgg.model_VGG16` for VGG-16 reference implementation
- NEW: add `model.resnet.ResNet_bottleneck` for ResNet reference implementation
- NEW: add `model.feature_pyramid_net.model_FPN` for Feature Pyramid Network reference implementation
version 0.16.6 [7-5-2018]
- NEW: add `functional.upsample_2d()` for 2D upsampling
- NEW: add `functional.upsample_2d_bilinear()` for bilinear 2D upsampling
version 0.16.5 [7-5-2018]
- NEW: add `functional.spatial_pyramid_pooling()` for SPP-net implementation
version 0.16.4 [7-4-2018]
- FIXED: wrong indexing when `targets` is an int vector for `objective.categorical_crossentropy_log()`
version 0.16.3 [7-3-2018]
- NEW: add `activation.log_softmax()` for a more numerically stable softmax
- NEW: add `objective.categorical_crossentropy_log()` for more numerically stable categorical cross-entropy
- MODIFIED: add an `eps` argument to `objective.categorical_crossentropy()` for numerical stability. Note 1e-7 is set as the default value of `eps`; you can set it to 0 to get the old `categorical_crossentropy()` back. See the sketch below.
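A hedged sketch contrasting the two paths (whether `categorical_crossentropy_log()` consumes log-probabilities from `log_softmax()`, and the `(predictions, targets)` argument order, are assumptions; `score` and `targets` are hypothetical symbolic variables):

```python
import theano.tensor as T
from dandelion.activation import log_softmax
from dandelion.objective import categorical_crossentropy, categorical_crossentropy_log

score = T.fmatrix('score')      # raw class scores, shape (batch, classes)
targets = T.ivector('targets')  # integer class labels

# Stable path: stay in log space, avoiding overflow/underflow inside exp()
log_p = log_softmax(score)
loss_stable = categorical_crossentropy_log(log_p, targets).mean()

# Classic path: probabilities are clamped by eps (default 1e-7); eps=0 restores the pre-0.16.3 behaviour
prob = T.nnet.softmax(score)
loss_classic = categorical_crossentropy(prob, targets, eps=0).mean()
```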
version 0.16.0 [6-13-2018]
- NEW: add `ext` module into the master branch of Dandelion. All the miscellaneous extensions will be organized here.
- NEW: add `ext.CV` sub-module, containing image I/O functions and basic image processing functions commonly used in model training
version 0.15.2 [5-28-2018]
- FIXED: `convTOP` should be constructed each time the `forward()` function of `ConvTransposed2D` is called
version 0.15.1 [5-25-2018]
- NEW: add `model` module into the master branch of Dandelion
- NEW: add U-net FCN implementation into the `model` module
- NEW: add `align_crop()` into the `functional` module
version 0.14.4 [4-17-2018]
Rename `updates.py` to `update.py`.
version 0.14.0 [4-10-2018]
In this version the Module's parameter interfaces are mostly redesigned, so it's incompatible with the previous version.
`self.params` and `self.self_updating_variables` no longer include sub-modules' parameters. To get all the parameters to be trained by the optimizer, including the sub-modules', you'll need to call the new interface function `.collect_params()`.
To collect self-defined updates for training, still call `.collect_self_updates()`. A usage sketch follows the list below.
- MODIFIED: `.get_weights()` and `.set_weights()` traverse the parameters in the same order as the sub-modules, so they're incompatible with the previous version
- MODIFIED: remove all `trainable` flags; you're now expected to use the `include` and `exclude` arguments of `.collect_params()` and `.collect_self_updates()` to enable/disable training for a certain module's parameters
- MODIFIED: to define the self-update expression for a `self_updating_variable`, use the `.update` attribute instead of the previous `.default_update`
- NEW: add auto-naming feature to the root class `Module`: if a sub-module is not yet named, it'll be auto-named by its instance name, so from now on you don't need to name a sub-module manually any more
- NEW: add `.set_weights_by_name()` to the `Module` class; you can use this function to set module weights saved by a previous version of Dandelion
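A minimal training-setup sketch showing where the new interfaces fit (the optimizer signature, the loss call, and `model` itself are illustrative assumptions, not the documented API):

```python
import theano
import theano.tensor as T
from dandelion.update import adam
from dandelion.objective import categorical_crossentropy

x = T.fmatrix('x')
y = T.ivector('y')

prediction = model.forward(x)                          # `model` is your Module (sub-)class instance
loss = categorical_crossentropy(prediction, y).mean()

params = model.collect_params()                        # all parameters to train, sub-modules included
# use the include/exclude arguments of collect_params() to restrict which modules get trained

updates = adam(loss, params)                           # gradient-based updates from the optimizer
updates.update(model.collect_self_updates())           # plus self-defined updates (e.g. BatchNorm statistics)

train_fn = theano.function([x, y], loss, updates=updates)
```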