December 7: Academic Talk by Prof. Zhou Tianyi (School of Smart Education)

Posted: 2017-12-06

Speaker: Prof. Zhou Tianyi

Title: SC2Net: Sparse LSTMs for Sparse Coding

Time: December 7 (Thursday), 14:30-16:00

Venue: Room 908, Jingyuan Building

Hosts: School of Smart Education; Office of Science and Technology

Speaker Bio:

Zhou Tianyi, Professor, Ph.D., graduated from Nanyang Technological University, Singapore. He is currently a Scientist at the Institute of High Performance Computing under Singapore's Agency for Science, Technology and Research (A*STAR), where he leads and participates in a range of machine-learning projects, including image recognition and taxi route planning. He previously served as a Senior R&D Engineer at Sony's R&D center in Silicon Valley, where he was responsible for the visual-perception component of the company's autonomous-vehicle project. He has published more than ten papers in top international journals and conferences, including AAAI, IJCAI, CVPR, and TIP, and has received a Best Poster Nomination at ACML 2012 and the Best Paper Award at the BeyondLabeler workshop at IJCAI 2016.

Abstract:

The iterative shrinkage-thresholding algorithm (ISTA) is one of the most popular optimization solvers for achieving sparse codes. However, ISTA suffers from the following problems: 1) ISTA employs a non-adaptive updating strategy, learning the parameters on each dimension with a fixed learning rate; such a strategy may lead to inferior performance due to the scarcity of diversity. 2) ISTA does not incorporate historical information into its updating rules, even though such information has been proven helpful for speeding up convergence. To address these challenging issues, we propose a novel formulation of ISTA (named adaptive ISTA) by introducing a novel adaptive momentum vector. To efficiently solve the proposed adaptive ISTA, we recast it as a recurrent neural network unit and show its connection with the well-known long short-term memory (LSTM) model. With the newly proposed unit, we present a neural network (termed SC2Net) that achieves sparse codes in an end-to-end manner. To the best of our knowledge, this is one of the first works to bridge the L1-solver and LSTM, and it may provide novel insights into understanding model-based optimization and LSTM. Extensive experiments show the effectiveness of our method on both unsupervised and supervised tasks.
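For context, the plain ISTA iteration the abstract criticizes can be sketched as below. This is a generic textbook version of ISTA for the lasso-style sparse coding objective min_z 0.5‖x − Dz‖² + λ‖z‖₁, not code from the talk; the function name, synthetic dictionary, and step-size choice are illustrative assumptions. Note the single fixed learning rate shared by every dimension and the lack of any momentum term, which are exactly the two limitations the adaptive ISTA in the talk is said to address.

```python
import numpy as np

def ista(x, D, lam=0.1, lr=None, n_iter=100):
    """Plain ISTA for sparse coding: min_z 0.5*||x - D z||^2 + lam*||z||_1.

    Illustrative sketch (not the talk's adaptive ISTA): a fixed,
    non-adaptive step size is shared by all dimensions, and no
    historical/momentum information enters the update.
    """
    if lr is None:
        # Standard choice: 1/L, where L = ||D||_2^2 is the Lipschitz
        # constant of the gradient of the smooth term.
        lr = 1.0 / np.linalg.norm(D, 2) ** 2
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)          # gradient of 0.5*||x - D z||^2
        z = z - lr * grad                  # same fixed rate on every dimension
        # Soft-thresholding (proximal operator of the L1 term)
        z = np.sign(z) * np.maximum(np.abs(z) - lr * lam, 0.0)
    return z
```

The talk's SC2Net replaces this hand-designed loop with a learned recurrent unit, so the per-dimension step sizes and the momentum-like state are trained end-to-end rather than fixed in advance.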

