Research & Laboratory

Title
Seminar [04/09] Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks
Date Posted
2019.04.08
Author
School of Electrical and Electronic Engineering
Post Content

< BK21 Plus BEST Information Technology Research Group Seminar Announcement >


Date & Time: April 9, 2019 (Tuesday), 11:00–12:00

Venue: Engineering Building 4, Room D405

Seminar Title: Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

Abstract:

Many machine learning tasks such as multiple instance learning, 3D shape recognition, and few-shot image classification are defined on sets of instances. Since solutions to such problems do not depend on the order of elements of the set, models used to address them should be permutation invariant. We present an attention-based neural network module, the Set Transformer, specifically designed to model interactions among elements in the input set. The model consists of an encoder and a decoder, both of which rely on attention mechanisms. In an effort to reduce computational complexity, we introduce an attention scheme inspired by inducing point methods from the sparse Gaussian process literature. It reduces the computation time of self-attention from quadratic to linear in the number of elements in the set. We show that our model is theoretically attractive, and we evaluate it on a range of tasks, demonstrating state-of-the-art performance compared to recent methods for set-structured data.
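The inducing-point attention scheme mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration under simplifying assumptions: a single attention head, no layer normalization or feed-forward sublayers, and illustrative function names and shapes; it is not the speaker's implementation. The set attends to a small number m of learnable inducing points, which in turn attend back to the set, so the cost is O(nm) rather than O(n²) in the set size n.

```python
import numpy as np

def attention(q, k, v):
    # Scaled dot-product attention: rows of q attend over rows of k/v.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)          # softmax over keys
    return w @ v

def isab(x, inducing):
    # Sketch of an induced set attention block (hypothetical simplification).
    # x: (n, d) set elements; inducing: (m, d) learnable inducing points, m << n.
    h = attention(inducing, x, x)   # (m, d): inducing points summarize the set, O(n*m)
    return attention(x, h, h)       # (n, d): elements read the summary back, O(n*m)

rng = np.random.default_rng(0)
x = rng.normal(size=(6, 4))         # a set of 6 elements in R^4
inducing = rng.normal(size=(2, 4))  # 2 inducing points
out = isab(x, inducing)             # (6, 4), permutation-equivariant in x
```

Permuting the rows of `x` permutes the output rows identically (equivariance), so pooling the output, e.g. a mean over rows, yields a permutation-invariant set representation as the abstract requires.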



Speaker (Name & Title / Affiliation): Dr. Juho Lee, Research Scientist / AITRICS

Host: Prof. Hanjun Kim, School of Electrical and Electronic Engineering