chemicalchecker.tool.adanet.dnn_stack_generator.StackDNNGenerator
- class StackDNNGenerator(optimizer, input_shape, nan_mask_value=0.0, layer_size=32, learn_mixture_weights=False, dropout=0.0, activation=tf.nn.relu, seed=None, **kwargs)[source]
Bases: Generator
Generates two DNN subnetworks at each iteration.
The first DNN has an identical shape to the most recently added subnetwork in previous_ensemble. The second has the same shape plus one more dense layer on top. This is similar to the adaptive network presented in Figure 2 of [Cortes et al. ICML 2017](https://arxiv.org/abs/1607.01097), without the connections to hidden layers of networks from previous iterations.
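A minimal sketch of this candidate-generation step, under stated assumptions: `DNNBuilder` is a hypothetical adanet.subnetwork.Builder whose `num_layers` argument sets the subnetwork depth, and the previously built subnetwork is assumed to record its depth under a shared "num_layers" key. The actual builder used by chemicalchecker may differ.

```python
# Illustrative sketch only (not the library's source).
def generate_candidates(self, previous_ensemble, iteration_number,
                        previous_ensemble_reports, all_reports):
    num_layers = 0
    if previous_ensemble:
        # Depth of the most recently added subnetwork in the ensemble.
        last = previous_ensemble.weighted_subnetworks[-1].subnetwork
        num_layers = int(last.shared["num_layers"])  # assumed bookkeeping key
    common = dict(optimizer=self._optimizer, layer_size=self._layer_size,
                  dropout=self._dropout, activation=self._activation,
                  seed=self._seed)
    return [
        DNNBuilder(num_layers=num_layers, **common),      # same shape as before
        DNNBuilder(num_layers=num_layers + 1, **common),  # one dense layer deeper
    ]
```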
Initializes a DNN Generator.
- Parameters:
optimizer – An Optimizer instance for training both the subnetwork and the mixture weights.
layer_size – Number of nodes in each hidden layer of the subnetwork candidates. Note that this parameter is ignored in a DNN with no hidden layers.
learn_mixture_weights – Whether to solve a learning problem to find best mixture weights, or use their default value according to the mixture weight type. When False, the subnetworks will return a no_op for the mixture weight train op.
dropout – The dropout rate, between 0 and 1. For example, dropout=0.1 would drop out 10% of input units.
activation – The activation function to be used.
seed – A random seed.
- Returns:
An instance of Generator.
Methods
See adanet.subnetwork.Generator for the inherited generate_candidates interface.
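A hedged usage sketch showing how such a generator is typically wired into adanet's TF1-style Estimator API. The head, optimizer, and the values given to input_shape and nan_mask_value below are illustrative assumptions, not taken from the chemicalchecker documentation.

```python
import adanet
import tensorflow as tf

from chemicalchecker.tool.adanet.dnn_stack_generator import StackDNNGenerator

# Assumed setup: a 128-dimensional input signature, NaNs masked to 0.0.
generator = StackDNNGenerator(
    optimizer=tf.compat.v1.train.AdamOptimizer(learning_rate=1e-3),
    input_shape=(128,),
    nan_mask_value=0.0,
    layer_size=32,
    learn_mixture_weights=False,
    dropout=0.1,            # drop 10% of input units
    activation=tf.nn.relu,
    seed=42,
)

# The generator proposes its two candidate subnetworks at every AdaNet
# iteration; adanet.Estimator trains, selects, and ensembles them.
estimator = adanet.Estimator(
    head=tf.estimator.RegressionHead(label_dimension=1),  # assumed task/head
    subnetwork_generator=generator,
    max_iteration_steps=1000,
)
```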