boxes.modules module

class boxes.modules.BoxEmbedding(num_embeddings: int, box_embedding_dim: int, box_type='SigmoidBoxTensor', weight: torch.FloatTensor = None, padding_index: int = None, trainable: bool = True, max_norm: float = None, norm_type: float = 2.0, scale_grad_by_freq: bool = False, sparse: bool = False, vocab_namespace: str = None, pretrained_file: str = None, init_interval_center=0.25, init_interval_delta=0.1)

Bases: allennlp.modules.token_embedders.embedding.Embedding

property all_boxes
box_types = {'BoxTensor': <class 'boxes.box_wrapper.BoxTensor'>, 'DeltaBoxTensor': <class 'boxes.box_wrapper.DeltaBoxTensor'>, 'MinDeltaBoxesOnTorus': <class 'boxes.box_wrapper.MinDeltaBoxesOnTorus'>, 'SigmoidBoxTensor': <class 'boxes.box_wrapper.SigmoidBoxTensor'>}
forward(inputs: torch.LongTensor)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.

get_bounding_box() → boxes.box_wrapper.BoxTensor
get_volumes(temp: Union[float, torch.Tensor]) → torch.Tensor
init_weights()
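
A minimal usage sketch for BoxEmbedding, assuming only the signatures documented above; the vocabulary size, dimensionality, and temperature are illustrative:

   import torch
   from boxes.modules import BoxEmbedding

   # 100 boxes, each in a 10-dimensional box space, with the default
   # box_type='SigmoidBoxTensor' from the signature above.
   embedder = BoxEmbedding(num_embeddings=100, box_embedding_dim=10)
   embedder.init_weights()   # re-initialize to the documented interval

   token_ids = torch.tensor([1, 5, 42], dtype=torch.long)
   # Call the module instance (not .forward()) so registered hooks run.
   boxes = embedder(token_ids)

   volumes = embedder.get_volumes(temp=1.0)   # per-box volumes at this temperature
   bounding = embedder.get_bounding_box()     # a boxes.box_wrapper.BoxTensor
   every_box = embedder.all_boxes             # the all_boxes property above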
class boxes.modules.LSTMBox(*args, box_type='SigmoidBoxes', **kwargs)

Bases: torch.nn.modules.rnn.LSTM

A module with a standard LSTM at the bottom and boxes at the output (see the usage sketch below).

forward(inp: torch.Tensor, hx: Optional[Tuple[torch.Tensor, torch.Tensor]] = None) → Tuple[TBoxTensor, Tuple[torch.Tensor, torch.Tensor]]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.
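
A usage sketch for LSTMBox. Since the constructor only exposes *args and **kwargs, passing the usual torch.nn.LSTM keyword arguments through is an assumption; the even hidden size reflects the common convention of splitting the hidden vector into box min/max halves:

   import torch
   from boxes.modules import LSTMBox

   # Constructor arguments are assumed to pass through to torch.nn.LSTM.
   lstm_box = LSTMBox(input_size=8, hidden_size=6, batch_first=True)

   inp = torch.randn(4, 12, 8)         # (batch, seq_len, input_size)
   boxes, (h_n, c_n) = lstm_box(inp)   # box tensor output, usual LSTM state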

class boxes.modules.PytorchSeq2BoxWrapper(module: torch.nn.modules.rnn.RNNBase, box_type='SigmoidBoxes')

Bases: allennlp.modules.seq2vec_encoders.pytorch_seq2vec_wrapper.PytorchSeq2VecWrapper

An AllenNLP-compatible sequence-to-box encoder module (see the usage sketch below).

forward(inp: torch.Tensor, mask: Optional[torch.Tensor] = None, hidden_state: Optional[torch.Tensor] = None) → TBoxTensor

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself instead of this method, since the former takes care of running the registered hooks while the latter silently ignores them.

get_output_dim(after_box: bool = False) → int
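
A sketch of wrapping a plain torch.nn.LSTM, assuming the usual AllenNLP seq2vec calling convention for the mask:

   import torch
   from boxes.modules import PytorchSeq2BoxWrapper

   rnn = torch.nn.LSTM(input_size=8, hidden_size=6, batch_first=True)
   encoder = PytorchSeq2BoxWrapper(rnn, box_type='SigmoidBoxes')

   inp = torch.randn(4, 12, 8)                  # (batch, seq_len, input_size)
   mask = torch.ones(4, 12, dtype=torch.bool)   # all timesteps valid
   seq_boxes = encoder(inp, mask)               # one box per sequence

   dim = encoder.get_output_dim()               # see the after_box flag above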
boxes.modules.mask_from_lens(*args, **kwargs)
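
The generated signature above hides the real parameters of mask_from_lens. The sketch below is a hypothetical re-implementation of what a lengths-to-mask helper of this name typically does, not the actual code:

   from typing import Optional

   import torch

   def mask_from_lens_sketch(lens: torch.Tensor, max_len: Optional[int] = None) -> torch.Tensor:
       """Hypothetical stand-in for boxes.modules.mask_from_lens."""
       if max_len is None:
           max_len = int(lens.max())
       positions = torch.arange(max_len, device=lens.device)
       # True where the position index is within the sequence length.
       return positions.unsqueeze(0) < lens.unsqueeze(1)

   mask = mask_from_lens_sketch(torch.tensor([3, 5, 2]))   # shape (3, 5), bool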