
[TOC]

- ## What is MorphNet?
+ ## New: FiGS: Fine-Grained Stochastic Architecture Search
+ FiGS is a probabilistic approach to channel regularization that we introduced
+ in [Fine-Grained Stochastic Architecture Search](https://arxiv.org/pdf/2006.09581.pdf).
+ It outperforms our previous regularizers and can be used as either a pruning
+ algorithm or a full-fledged Differentiable Architecture Search method. This is
+ the recommended way to apply MorphNet. In the documentation below it is
+ referred to as the `LogisticSigmoid` regularizer.
+

+ ## What is MorphNet?
MorphNet is a method for learning deep network structure during training. The
key principle is continuous relaxation of the network-structure learning
problem. In short, the MorphNet regularizer pushes the influence of filters down,
@@ -21,10 +29,9 @@ introduced in our [CVPR 2018](http://cvpr2018.thecvf.com/) paper "[MorphNet: Fa
Deep Network Structure](https://arxiv.org/abs/1711.06798)". An overview of the
approach as well as device-specific latency regularizers were presented in
[GTC 2019](https://gputechconf2019.smarteventscloud.com/connect/sessionDetail.ww?SESSION_ID=272314). [[slides](g3doc/MorphNet_GTC2019.pdf "GTC Slides"), recording: [YouTube](https://youtu.be/UvTXhTvJ_wM), [GTC on-demand](https://on-demand.gputechconf.com/gtc/2019/video/_/S9645/)].
+ Our new probabilistic approach to pruning is called FiGS, and is detailed in
+ [Fine-Grained Stochastic Architecture Search](https://arxiv.org/pdf/2006.09581.pdf).

- **NEW:** FiGS, is a probabilistic approach to channel regularization that we introduced
- in [Fine-Grained Stochastic Architecture Search](https://arxiv.org/pdf/2006.09581.pdf).
- It outperforms our previous regularizers and can be used as either a pruning algorithm or a full fledged Differentiable Architecture Search method.

## Usage

@@ -45,12 +52,12 @@ To use MorphNet, you must:

  * your target cost (e.g., FLOPs, latency)
  * your ability to add new layers to your model:
-    * If possible, add
+    * Add
      our probabilistic gating operation after any layer you wish to prune, and
-      use the `LogisticSigmoid` regularizers.
+      use the `LogisticSigmoid` regularizers. **\[recommended\]**
    * If you are unable to add new layers, select regularizer type based on
      your network architecture: use `Gamma` regularizer if the seed network
-      has BatchNorm; use `GroupLasso` otherwise.
+      has BatchNorm; use `GroupLasso` otherwise \[deprecated\] (see the sketch
+      after this list).
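
As a rough sketch of that choice, the snippet below constructs a FLOP regularizer either way. It assumes the `morph_net.network_regularizers.flop_regularizer` API; `logits`, `inputs`, `labels`, and the thresholds are placeholders, not recommendations:

```python
from morph_net.network_regularizers import flop_regularizer

# Seed network without BatchNorm: the (deprecated) GroupLasso variant
# regularizes the convolution weights directly.
network_regularizer = flop_regularizer.GroupLassoFlopsRegularizer(
    output_boundary=[logits.op],            # start traversal at the output
    input_boundary=[inputs.op, labels.op],  # do not cross into inputs/labels
    threshold=1e-3)                         # channels below this count as dead

# Seed network with BatchNorm: use the Gamma variant instead, which reuses
# the BatchNorm scale ("gamma") variables; see the full example further below.
```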

  Note: If you use BatchNorm, you must enable the scale parameters (“gamma
  variables”), i.e., by setting `scale=True` if you are using
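
For instance, a minimal sketch with `tf.keras` layers (where `scale=True` is already the default; some older wrappers such as `tf.contrib.layers.batch_norm` default to `scale=False`):

```python
import tensorflow as tf

# The Gamma regularizer reads BatchNorm's learned scale ("gamma") variables,
# so they must exist: keep scale=True (written explicitly here for emphasis).
bn = tf.keras.layers.BatchNormalization(scale=True)
```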
@@ -155,7 +162,7 @@ in your model. In this example, the regularizer will traverse the graph
starting with `logits`, and will not go past any op that is earlier in the graph
than the `inputs` or `labels`; this allows you to specify the subgraph for
MorphNet to optimize.

- # TODO Add Keras example.
+ <!-- TODO Add Keras example. -->
```python
from morph_net.network_regularizers import flop_regularizer
from morph_net.tools import structure_exporter
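
# What follows is a hedged continuation sketch of the truncated example:
# `build_model`, `inputs`, `labels`, and the constants are placeholders, and
# the repo's maintained end-to-end example may differ in detail.
logits = build_model()

network_regularizer = flop_regularizer.GammaFlopsRegularizer(
    output_boundary=[logits.op],            # traversal starts at the logits
    input_boundary=[inputs.op, labels.op],  # and stops at inputs/labels
    gamma_threshold=1e-3)

# Scale the MorphNet cost and add it to the task loss during training.
regularization_strength = 1e-10
regularizer_loss = (
    network_regularizer.get_regularization_term() * regularization_strength)

# structure_exporter (imported above) can later write out the learned
# per-layer channel counts for rebuilding the slimmed network.
```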