Neural Tensor Networks with Diagonal Slice Matrices
Although neural tensor networks (NTNs) have been successful in many natural language processing tasks,
they require a large number of parameters, which often leads to overfitting and long training times.
We address these issues by applying eigendecomposition to each slice matrix of the tensor so that it can be represented by a diagonal matrix, thereby reducing the number of parameters.
We evaluate our proposed NTN models in two tasks.
First, the proposed models are evaluated in a knowledge graph completion task.
Second, a recursive NTN (RNTN) extension of the proposed models is evaluated on a logical reasoning task.
The experimental results show that our proposed models learn better and faster than the original (R)NTNs.
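To illustrate the parameter savings of diagonal slice matrices, the following is a minimal NumPy sketch (not the paper's implementation; shapes, function names, and the shared-basis simplification are assumptions). Replacing each full d-by-d slice matrix with a diagonal one reduces the bilinear term's parameters from k*d*d to k*d:

```python
import numpy as np

def bilinear_full(e1, e2, W):
    """Bilinear term of an NTN with full slice matrices.

    e1, e2: entity vectors of shape (d,)
    W: tensor of shape (k, d, d) -> k*d*d parameters
    Returns a score vector of shape (k,) with entries e1^T W_k e2.
    """
    return np.einsum('i,kij,j->k', e1, W, e2)

def bilinear_diag(e1, e2, D):
    """Bilinear term with diagonal slice matrices.

    D: matrix of shape (k, d) holding the diagonals -> only k*d parameters
    e1^T diag(D_k) e2 reduces to a weighted elementwise product.
    """
    return (D * (e1 * e2)).sum(axis=1)

# Equivalence check: a diagonal slice tensor gives identical scores.
rng = np.random.default_rng(0)
d, k = 4, 3
e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
D = rng.standard_normal((k, d))
W = np.stack([np.diag(row) for row in D])  # embed diagonals as full slices
print(np.allclose(bilinear_full(e1, e2, W), bilinear_diag(e1, e2, D)))
```

The diagonal form also cuts the cost of each bilinear term from O(d^2) to O(d) per slice, which is consistent with the faster training reported above.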