Sho Takase

The Vanishing Gradient Problem in Deep Transformers and How to Address It
Natural Language Processing with Neural Networks
NeurIPS 2020 Attendance Report
STAIR Lab Seminar 202105
Rethinking Perturbations in Encoder-Decoders for Fast Training
Robust Neural Machine Translation with Doubly Adversarial Inputs
Breaking the Softmax Bottleneck via Learnable Monotonic Pointwise Non-linearities
Enriching Word Vectors with Subword Information
Harnessing Deep Neural Networks with Logic Rules
4th NLPDL
Learning Composition Models for Phrase Embeddings
Retrofitting Word Vectors to Semantic Lexicons
NLP2015: Compositionality-Based Semantic Computation of Relation Patterns
Lexical Inference over Multi-Word Predicates