ultra.ranking_model package

Submodules

ultra.ranking_model.base_ranking_model module

The base class that defines the APIs needed to implement a ranking model.

class ultra.ranking_model.base_ranking_model.ActivationFunctions

Bases: object

Activation Functions key strings.

ELU = 'elu'
RELU = 'relu'
SELU = 'selu'
SIGMOID = 'sigmoid'
TANH = 'tanh'
class ultra.ranking_model.base_ranking_model.BaseRankingModel(hparams_str=None, **kwargs)

Bases: abc.ABC

ACT_FUNC_DIC = {'elu': <function elu>, 'relu': <function relu>, 'selu': <function selu>, 'sigmoid': <function sigmoid>, 'tanh': <function tanh>}
INITIALIZER_DIC = {'constant': <tensorflow.python.ops.init_ops.Constant object>}
NORM_FUNC_DIC = {'batch': <class 'tensorflow.python.keras.layers.normalization.BatchNormalization'>, 'layer': <class 'tensorflow.python.keras.layers.normalization.LayerNormalization'>}
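
These dictionaries map the key strings defined by ActivationFunctions, Initializer, and NormalizationFunctions to the corresponding TensorFlow callables. A minimal sketch of how a key string can be resolved, assuming a recent TensorFlow with eager execution; the variable names are illustrative only:

    import tensorflow as tf
    from ultra.ranking_model.base_ranking_model import (
        ActivationFunctions, BaseRankingModel)

    # Resolve the key string 'elu' to its TensorFlow activation function.
    act_name = ActivationFunctions.ELU
    act_fn = BaseRankingModel.ACT_FUNC_DIC[act_name]

    # Apply the resolved activation to a small tensor.
    scores = act_fn(tf.constant([-1.0, 0.0, 2.0]))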
abstract __init__(hparams_str=None, **kwargs)

Create the network.

Parameters

hparams_str – (string) The hyper-parameters used to build the network.

abstract build(input_list, noisy_params=None, noise_rate=0.05, is_training=False, **kwargs)

Create the model.

Parameters
  • input_list – (list<tf.tensor>) A list of tensors containing the features for a list of documents.

  • noisy_params – (dict<parameter_name, tf.variable>) A dictionary of noisy parameters to add.

  • noise_rate – (float) A value specifying how much noise to add.

  • is_training – (bool) A flag indicating whether the model is running in training mode.

Returns

A list of tf.Tensor containing the ranking scores for each instance in input_list.
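
A minimal sketch of a concrete subclass implementing build(); the class name, the single weight matrix per position, and the reliance on the inherited get_variable() are illustrative assumptions, not the package's actual DNN or Linear implementation:

    import tensorflow as tf
    from ultra.ranking_model.base_ranking_model import BaseRankingModel

    class TinyLinearRanker(BaseRankingModel):
        """Scores each document tensor with a learned linear projection."""

        def __init__(self, hparams_str=None, **kwargs):
            self.hparams_str = hparams_str
            self.model_parameters = {}

        def build(self, input_list, noisy_params=None, noise_rate=0.05,
                  is_training=False, **kwargs):
            output_list = []
            for i, x in enumerate(input_list):
                feature_size = x.get_shape()[-1]
                # One weight matrix per position; noise is injected by
                # get_variable() when noisy_params is provided.
                W = self.get_variable('W_%d' % i, [feature_size, 1],
                                      noisy_params=noisy_params,
                                      noise_rate=noise_rate)
                output_list.append(tf.matmul(x, W))
            return output_list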

get_variable(name, shape, noisy_params=None, noise_rate=0.05, **kwargs)

Get a TensorFlow variable for the model, adding noise if required.

Parameters
  • name – The name of the variable.

  • shape – The shape of the variable.

  • noisy_params – (dict<parameter_name, tf.variable>) A dictionary of noisy parameters to add.

  • noise_rate – (float) A value specifying how much noise to add.

Returns

A tf.Tensor
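
Conceptually, the noise mechanism adds a scaled perturbation to the variable when its name appears in noisy_params. The sketch below is an assumption about that behavior for illustration, not the module's actual code:

    import tensorflow as tf

    def get_variable_sketch(name, shape, noisy_params=None, noise_rate=0.05):
        # Create (or reuse) the underlying TensorFlow variable.
        var = tf.compat.v1.get_variable(name, shape)
        if noisy_params is not None and name in noisy_params:
            # Perturb the variable by the supplied noise, scaled by noise_rate.
            var = var + noise_rate * noisy_params[name]
        return var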

model_parameters = {}
class ultra.ranking_model.base_ranking_model.Initializer

Bases: object

Initializer key strings.

CONSTANT = 'constant'
class ultra.ranking_model.base_ranking_model.NormalizationFunctions

Bases: object

Normalization Functions key strings.

BATCH = 'batch'
LAYER = 'layer'
ultra.ranking_model.base_ranking_model.selu(x)
Create the scaled exponential linear unit (SELU) activation function. More information can be found in

Klambauer, G., Unterthiner, T., Mayr, A. and Hochreiter, S., 2017. Self-normalizing neural networks. In Advances in neural information processing systems (pp. 971-980).

Parameters

x – (tf.Tensor) A tensor containing a set of numbers.

Returns

The tf.Tensor produced by applying SELU on each element in x.
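
A reference sketch of the SELU definition with the fixed constants from Klambauer et al. (2017); this illustrates what the function computes and is not necessarily the module's exact code:

    import tensorflow as tf

    def selu_sketch(x):
        alpha = 1.6732632423543772   # fixed alpha from the SELU paper
        scale = 1.0507009873554805   # fixed lambda (scale) from the SELU paper
        # scale * x for x >= 0, scale * alpha * (exp(x) - 1) otherwise.
        return scale * tf.where(x >= 0.0, x, alpha * (tf.exp(x) - 1.0))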

ultra.ranking_model.dnn module

ultra.ranking_model.linear module

Linear map: sum_i(args[i] * W[i]), where W[i] is a variable.

Parameters
  • args – a 2D Tensor or a list of 2D Tensors of shape [batch, n].

  • output_size – int, second dimension of W[i].

  • bias – boolean, whether to add a bias term or not.

  • bias_initializer – starting value to initialize the bias (default is all zeros).

  • kernel_initializer – starting value to initialize the weights.

Returns

A 2D Tensor with shape [batch, output_size] equal to sum_i(args[i] * W[i]), where the W[i] are newly created matrices.

Raises

ValueError – if any of the arguments has an unspecified or wrong shape.
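
A minimal sketch of the linear map described above, assuming each element of args is a 2D Tensor of shape [batch, n]; the function and variable names are illustrative and not taken from the module:

    import tensorflow as tf

    def linear_sketch(args, output_size, bias=True):
        if not isinstance(args, (list, tuple)):
            args = [args]
        outputs = []
        for i, a in enumerate(args):
            n = a.get_shape()[-1]
            # Each args[i] gets its own newly created weight matrix W[i].
            W = tf.compat.v1.get_variable('W_%d' % i, [n, output_size])
            outputs.append(tf.matmul(a, W))
        result = tf.add_n(outputs)  # sum_i(args[i] * W[i])
        if bias:
            b = tf.compat.v1.get_variable(
                'b', [output_size], initializer=tf.zeros_initializer())
            result = result + b
        return result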

Module contents

ultra.ranking_model.list_available()
Return type

list
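
Hypothetical usage of list_available(); the names in the returned list depend on the installed version:

    import ultra.ranking_model

    # Returns a list of the ranking model names registered in the package.
    print(ultra.ranking_model.list_available())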