{"id":17,"date":"2011-04-02T11:46:33","date_gmt":"2011-04-02T02:46:33","guid":{"rendered":"http:\/\/sara.lrlab\/wordpress\/?page_id=17"},"modified":"2026-04-27T23:22:11","modified_gmt":"2026-04-27T14:22:11","slug":"%e5%a5%a5%e6%9d%91%e3%83%bb%e9%ab%98%e6%9d%91%e7%a0%94%e7%a9%b6%e5%ae%a4%e3%83%9b%e3%83%bc%e3%83%a0%e3%83%9a%e3%83%bc%e3%82%b8","status":"publish","type":"page","link":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/","title":{"rendered":"\u5965\u6751\u30fb\u8239\u8d8a\u7814\u7a76\u5ba4"},"content":{"rendered":"<style>.lrmail:after{content: \"@lr.pi.titech.ac.jp\";}<br \/>\n.pimail:after {<br \/>\n  content: \"@pi.titech.ac.jp\";<br \/>\n}<br \/>\n<\/style>\n<p>\u65e5\u672c\u8a9e\/<a href=\"http:\/\/lr-www.pi.titech.ac.jp\/wp_en\/\">English<\/a><\/p>\n<p>\u5965\u6751\u30fb\u8239\u8d8a\u7814\u7a76\u5ba4\u3067\u306f\uff0c\u3053\u3068\u3070\u3092\u8a08\u7b97\u6a5f\u3067\u51e6\u7406\u3059\u308b\u6280\u8853(<strong>\u81ea\u7136\u8a00\u8a9e\u51e6\u7406<\/strong>)\u306b\u95a2\u3059\u308b\u7814\u7a76\u3068\uff0c\u305d\u306e\u6280\u8853\u3092\u7528\u3044\u305f\u5fdc\u7528\u30b7\u30b9\u30c6\u30e0\u306e\u958b\u767a\u3092\u884c\u306a\u3063\u3066\u3044\u307e\u3059\uff0e<br \/>\n\u3053\u3068\u3070\u306e\u7406\u89e3\u3068\u3044\u3046\u30c6\u30fc\u30de\u3067\u306f\uff0c\u96e3\u3057\u3044\u3068\u3055\u308c\u308b\uff0c\u610f\u5473\uff0c\u6587\u8108\u7406\u89e3\u306b\u95a2\u3059\u308b\u7814\u7a76\u3092\u4e2d\u5fc3\u306b\u884c\u306a\u3063\u3066\u3044\u307e\u3059\uff0e\u305d\u308c\u3068\u540c\u6642\u306b\uff0c\u4e16\u306e\u4e2d\u3067\u5f79\u306b\u7acb\u3064\u30b7\u30b9\u30c6\u30e0\u306e\u958b\u767a\u3082\u884c\u306a\u3063\u3066\u3044\u307e\u3059\uff0e\u5177\u4f53\u7684\u306b\u306f\uff0c\u30c6\u30ad\u30b9\u30c8\u8981\u7d04\uff0c\u4eba\u3005\u306e\u610f\u898b\uff0c\u611f\u60c5\u3092\u5206\u6790\u3059\u308b\u8a55\u5224\u5206\u6790\uff0c\u30bd\u30fc\u30b7\u30e3\u30eb\u30e1\u30c7\u30a3\u30a2\u3092\u5bfe\u8c61\u3068\u3057\u305f\u30c6\u30ad\u30b9\u30c8\u30de\u30a4\u30cb\u30f3\u30b0\uff0c\u4eba\u3068\u30c6\u30ad\u30b9\u30c8\u3084\u97f3\u58f0\u3067\u3084\u308a\u3068\u308a\u3059\u308b\u5bfe\u8a71\u30b7\u30b9\u30c6\u30e0\u306a\u3069\u306b\u95a2\u3059\u308b\u30b7\u30b9\u30c6\u30e0\u3092\u958b\u767a\u3057\u3066\u3044\u307e\u3059\uff0e<\/p>\n<p>\u5965\u6751\u6559\u6388\u306f2027\u5e743\u6708\u306b\u5b9a\u5e74\u9000\u8077\u3092\u8fce\u3048\u307e\u3059\u306e\u3067\uff0c\u7814\u7a76\u5ba4\u306b\u95a2\u5fc3\u3092\u6301\u3063\u3066\u304f\u3060\u3055\u3063\u3066\u3044\u308b\u65b9\u3005\u306f\uff0c\u8239\u8d8a\u5148\u751f\u306b\u304a\u554f\u3044\u5408\u308f\u305b\u304f\u3060\u3055\u3044\uff0e<\/p>\n<p><strong>2026\/5\/17\uff08\u65e5\uff09 15:00 \u3088\u308a\u7814\u7a76\u5ba4\u8aac\u660e\u4f1a\u3092\u884c\u3044\u307e\u3059\uff0e\u53c2\u52a0\u3092\u5e0c\u671b\u3055\u308c\u308b\u65b9\u306f\uff0c\u4e8b\u524d\u306b\u8239\u8d8a\u306b\u30e1\u30fc\u30eb\u3067\u9023\u7d61\u3092\u304f\u3060\u3055\u3044\uff0e\u306a\u304a\u5f53\u65e5\u306f\u3059\u305a\u304b\u3051\u30b5\u30a4\u30a8\u30f3\u30b9\u30c7\u30a4\u306e\u958b\u50ac2\u65e5\u76ee\u3067\u3059\uff0e<\/strong><\/p>\n<p><a href=\"http:\/\/lr-www.pi.titech.ac.jp\/wp\/?page_id=4\">\u4e3b\u306a\u7814\u7a76\u30c6\u30fc\u30de<\/a> | <a href=\"http:\/\/lr-www.pi.titech.ac.jp\/wp\/?page_id=506\">\u7814\u7a76\u5ba4\u306b\u52a0\u308f\u308a\u305f\u3044\u7686\u3055\u3093\u3078<\/a> | <a href=\"http:\/\/lr-www.pi.titech.ac.jp\/wp\/?page_id=77#access\">\u30a2\u30af\u30bb\u30b9<\/a><\/p>\n<h2>\u65b0\u7740\u60c5\u5831<\/h2>\n<ul>\n<li style=\"list-style-type: 
none;\">\n<ul>\n<li><strong>2026\/4\/6<\/strong> \u4ee5\u4e0b\u306e\u8ad6\u6587\u304c The 64th Annual Meeting of the Association for Computational Linguistics\u306b\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Ailiang Lin, zhuoyun li, Yusong Wang, Kotaro Funakoshi, Manabu Okumura<\/strong><br \/>\nCausal2Vec: Improving Decoder-only LLMs as Embedding Models through a Contextual Token<\/li>\n<li><strong>2026\/4\/6<\/strong> \u4ee5\u4e0b\u306e\u8ad6\u6587\u304c Findings of The 64th Annual Meeting of the Association for Computational Linguistics\u306b\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Ailiang Lin, zhuoyun li, MAO KEYU, Kotaro Funakoshi, Manabu Okumura<\/strong><br \/>\nEmbedding-based In-Context Prompt Training for Enhancing LLMs as Text Encoders<br \/>\n<strong>Sungwoo Han, Sangjun Moon, Jingun Kwon, Hidetaka Kamigaito, Manabu Okumura<\/strong><br \/>\nMeasuring Watermarking under Jailbreaking: ASR Inflation and Goal-Compliance Mismatch<br \/>\n<strong>Riza Setiawan Soetedjo, Yusuke Sakai, Hidetaka Kamigaito, Jingun Kwon, Manabu Okumura, Taro Watanabe<\/strong><br \/>\nEnhancing Factuality through Consensus and Consistency in Summarization Using Minimum Bayes Risk Decoding<\/li>\n<li><strong>2026\/3\/18<\/strong> \u4ee5\u4e0b\u306e\u8ad6\u6587\u304c \u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u7b2c267\u56de\u81ea\u7136\u8a00\u8a9e\u51e6\u7406\u7814\u7a76\u4f1a\u3067\u512a\u79c0\u7814\u7a76\u8cde\u3092\u53d7\u8cde\u3057\u307e\u3057\u305f.<br \/>\n<strong>\u6d45\u4e95\u6176\u6717, \u5cb8\u672c\u88d5, \u5c0f\u5c3e\u8ce2\u751f, \u5c0f\u6749\u54f2, \u8239\u8d8a\u5b5d\u592a\u90ce, \u5965\u6751\u5b66<\/strong><br \/>\n\u5927\u898f\u6a21\u8a00\u8a9e\u30e2\u30c7\u30eb\u306b\u304a\u3051\u308b\u30d7\u30ed\u30f3\u30d7\u30c8\u77e5\u8b58\u30ab\u30c3\u30c8\u30aa\u30d5\u6027\u80fd\u306e\u691c\u8a3c<\/li>\n<li><strong>2026\/3\/12<\/strong> \u4ee5\u4e0b\u306e\u767a\u8868\u304c\uff0c\u8a00\u8a9e\u51e6\u7406\u5b66\u4f1a\u7b2c32\u56de\u5e74\u6b21\u5927\u4f1a(NLP2026)\u3067\u59d4\u54e1\u7279\u5225\u8cde\u3092\u53d7\u8cde\u3057\u307e\u3057\u305f.<br \/>\n<strong>\u738b \u7565\u4e1e, \u5c3e\u5d0e \u614e\u592a\u90ce, \u4e0a\u57a3\u5916 \u82f1\u525b, \u6797 \u514b\u5f66, Jingun Kwon, \u5965\u6751 \u5b66, \u6e21\u8fba \u592a\u90ce<\/strong><br \/>\n\u753b\u50cf\u751f\u6210\u30e2\u30c7\u30eb\u306b\u304a\u3051\u308b\u76f4\u55a9\u55a9\u4f53\u306e\u751f\u6210\u6319\u52d5\u5206\u6790<br \/>\n<strong>\u6751\u4e0a \u8061\u4e00\u6717, \u4e0a\u57a3\u5916 \u82f1\u525b, \u9ad8\u6751 \u5927\u4e5f, \u5965\u6751 \u5b66<\/strong><br \/>\n\u500b\u5225\u9078\u597d\u306e\u7570\u8cea\u6027\u3092\u8003\u616e\u3057\u305f\u5927\u559c\u5229\u30e6\u30fc\u30e2\u30a2\u9078\u597d\u8981\u56e0\u306e\u5206\u6790<\/li>\n<li><strong>2026\/2\/13<\/strong> \u4ee5\u4e0b\u306e\u8ad6\u6587\u304c 2026 International Conference on Language Resources and Evaluation (LREC2026)\u306b\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>HYEYEON KIM, Sungwoo Han, Jingun Kwon, Hidetaka Kamigaito and Manabu Okumura<\/strong><br \/>\nMMCIG: Multimodal Cover Image Generation for Text-only Documents and Its Dataset Construction via Pseudo-labeling<\/li>\n<li><strong>2026\/1\/26<\/strong> \u4ee5\u4e0b\u306e\u8ad6\u6587\u304c The Fourteenth International Conference on Learning Representations (ICLR 2026)\u306b\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Yicheng Xu, Yue Wu, Jiashuo Yu, Ziang Yan, Tianxiang Jiang, Yinan He, Qingsong Zhao, Kai Chen, Yu Qiao, Limin Wang, Manabu Okumura, Yi Wang<\/strong><br \/>\nExpVid: 
<li><strong>2026/1/4</strong> The following paper was accepted to the 19th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2026).<br />
<strong>Ye Xiong, Hidetaka Kamigaito, Soichiro Murakami, Peinan Zhang, Hiroya Takamura, Manabu Okumura</strong><br />
Progressive Visual Refinement for Multi-modal Summarization</li>
<li><strong>2026/1/4</strong> The following papers were accepted to Findings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2026).<br />
<strong>Juseon Do, Sungwoo Han, Jingun Kwon, Hidetaka Kamigaito, Manabu Okumura</strong><br />
ConRAS: Contrastive In-context Learning Framework for Retrieval-Augmented Summarization<br />
<strong>LI XIAO, Kotaro Funakoshi, Manabu Okumura</strong><br />
Emotion Recognition in Multi-Speaker Conversations through Speaker Identification, Knowledge Distillation, and Hierarchical Fusion</li>
<li><strong>2025/9/19</strong> The following paper was accepted to the Thirty-ninth Annual Conference on Neural Information Processing Systems (NeurIPS 2025).<br />
<strong>Dongyuan Li, Shiyin Tan, Ying Zhang, Ming Jin, Shirui Pan, Manabu Okumura, Renhe Jiang</strong><br />
DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs</li>
<li><strong>2025/8/21</strong> The following paper was accepted to Findings of the 2025 Conference on Empirical Methods in Natural Language Processing (EMNLP 2025).<br />
<strong>Sangjun Moon, Dasom choi, Jingun Kwon, Hidetaka Kamigaito, Manabu Okumura</strong><br />
Length Representations in Large Language Models</li>
<li><strong>2025/8/5</strong> The following paper received the Best Student Paper Award at the 31st SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2025).<br />
<strong>Shiyin Tan, Dongyuan Li, Renhe Jiang, Zhen Wang, Xingtong Yu, Manabu Okumura</strong><br />
Taming Recommendation Bias with Causal Intervention on Evolving Personal Popularity</li>
<li><strong>2025/7/22</strong> The following paper was accepted to IEEE Robotics and Automation Letters (RA-L).<br />
<strong>Takao Obi, Kotaro Funakoshi</strong><br />
Breathe and Speak Attentively: Implementing Respiratory Awareness into Conversational Robots</li>
<li><strong>2025/7/8</strong> The following co-authored papers on the 7th Dialogue System Live Competition, whose organization 佐々木裕多 and 船越孝太郎 of our laboratory took part in, were accepted to the 26th Annual Meeting of the Special Interest Group on Discourse and Dialogue (SIGDIAL 2025).<br />
<strong>Takahashi et al.</strong>, Analyzing Dialogue System Behavior in a Specific Situation Requiring Interpersonal Consideration<br />
<strong>Sato et al.</strong>, Key Challenges in Multimodal Task-Oriented Dialogue Systems: Insights from a Large Competition-Based Dataset</li>
<li><strong>2025/5/24</strong> The following paper was accepted to the ACL 2025 System Demonstration Track.<br />
<strong>Yuichiro Mori, Chikara Tanaka, Aru Maekawa, Satoshi Kosugi, Tatsuya Ishigaki, Kotaro Funakoshi, Hiroya Takamura, Manabu Okumura</strong><br />
Live Football Commentary System Providing Background Information</li>
<li><strong>2025/5/15</strong> The following paper was accepted to the 63rd Annual Meeting of the Association for Computational Linguistics (ACL 2025).<br />
<strong>Boxuan Lyu, Hidetaka Kamigaito, Kotaro Funakoshi, Manabu Okumura</strong><br />
Unveiling the Power of Source: Source-based Minimum Bayes Risk Decoding for Neural Machine Translation</li>
<li><strong>2025/5/15</strong> The following paper was accepted to Findings of the 63rd Annual Meeting of the Association for Computational Linguistics (ACL 2025).<br />
<strong>Soichiro Murakami, Peinan Zhang, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura</strong><br />
AdParaphrase v2.0: Generating Attractive Ad Texts Using a Preference-Annotated Paraphrase Dataset</li>
<li><strong>2025/5/15</strong> The following paper was accepted to the 31st SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2025).<br />
<strong>Shiyin Tan, Dongyuan Li, Renhe Jiang, Zhen Wang, Xingtong Yu, Manabu Okumura</strong><br />
Taming Recommendation Bias with Causal Intervention on Evolving Personal Popularity</li>
<li><strong>2025/4/5</strong> The following paper was accepted to the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2025).<br />
<strong>Shiyin Tan, Jaeeon Park, Dongyuan Li, Renhe Jiang and Manabu Okumura</strong><br />
A Unified Retrieval Framework with Document Ranking and EDU Filtering for Multi-document Summarization</li>
<li><strong>2025/4/1</strong> The following paper was accepted to the International Joint Conference on Neural Networks (IJCNN 2025).<br />
<strong>Yiqian Huang, Ying Zhang, Kotaro Funakoshi, Manabu Okumura, Yang Cao</strong><br />
S2N: A Synthetic Data-Driven Approach for Speaker-to-Dialogue Attribution in Novels</li>
<li><strong>2025/3/21</strong> TabiToc, a joint team of the Iwahashi Laboratory at Okayama Prefectural University and our laboratory (佐々木裕多, 小尾賢生, 船越孝太郎), won the best award in both the situation track and the task track of the 7th Dialogue System Live Competition.</li>
<li><strong>2025/3/13</strong> The following presentation received a Young Researcher Encouragement Award at the 31st Annual Meeting of the Association for Natural Language Processing (NLP2025).<br />
<strong>中根 稜介, 前川 在, 上垣外 英剛, 平尾 努, 奥村 学</strong><br />
大規模言語モデルを用いたシフト還元型句構造解析</li>
<li><strong>2025/1/23</strong> The following paper was accepted to the Thirteenth International Conference on Learning Representations (ICLR 2025).<br />
<strong>Jian Wu, Linyi Yang, Zhen Wang, Manabu Okumura, Yue Zhang</strong><br />
CofCA: A STEP-WISE Counterfactual Multi-hop QA benchmark</li>
<li><strong>2025/1/23</strong> The following paper was accepted to the Thirteenth International Conference on Learning Representations (ICLR 2025).<br />
<strong>Jian Wu, Linyi Yang, Dongyuan Li, Yuliang Ji, Manabu Okumura, Yue Zhang</strong><br />
MMQA: Evaluating LLMs with Multi-Table Multi-Hop Complex Questions</li>
<li><strong>2025/1/23</strong> The following paper was accepted to Findings of the 2025 Annual Conference of the Nations of the Americas Chapter of the ACL (NAACL 2025).<br />
<strong>Juseon-Do, Jaesung Hwang, Jingun Kwon, Hidetaka Kamigaito, Manabu Okumura</strong><br />
Considering Length Diversity in Retrieval-Augmented Summarization</li>
<li><strong>2025/1/23</strong> The following paper was accepted to Findings of the 2025 Annual Conference of the Nations of the Americas Chapter of the ACL (NAACL 2025).<br />
<strong>Soichiro Murakami, Peinan Zhang, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura</strong><br />
AdParaphrase: Paraphrase Dataset for Analyzing Linguistic Features toward Generating Attractive Ad Texts</li>
<li><strong>2025/1/21</strong> The following paper was accepted to The Web Conference 2025.<br />
<strong>Dongyuan Li, Satoshi Kosugi, Ying Zhang, Manabu Okumura, Feng Xia, Renhe Jiang</strong><br />
Revisiting Dynamic Graph Clustering via Matrix Factorization</li>
<li><strong>2024/12/9</strong> The following paper received the Best Paper Award at the 38th Pacific Asia Conference on Language, Information and Computation (PACLIC 2024).<br />
<strong>Chun-Fang Chuang, Dongyuan Li, Satoshi Kosugi, Kotaro Funakoshi, Manabu Okumura</strong><br />
LPLS: A Selection Strategy Based on Pseudo-Labeling Status for Semi-Supervised Active Learning in Text Classification</li>
<li><strong>2024/10/16</strong> The following paper was accepted to the 38th Pacific Asia Conference on Language, Information and Computation (PACLIC 2024).<br />
<strong>Yuanyuan Cai, Satoshi Kosugi, Kotaro Funakoshi, Manabu Okumura</strong><br />
Enhancing Image Clustering with Captions</li>
<li><strong>2024/10/16</strong> The following paper was accepted to the 38th Pacific Asia Conference on Language, Information and Computation (PACLIC 2024).<br />
<strong>Chun-Fang Chuang, Dongyuan Li, Satoshi Kosugi, Kotaro Funakoshi, Manabu Okumura</strong><br />
LPLS: A Selection Strategy Based on Pseudo-Labeling Status for Semi-Supervised Active Learning in Text Classification</li>
<li><strong>2024/9/26</strong> The following paper was accepted to the Thirty-eighth Annual Conference on Neural Information Processing Systems (NeurIPS 2024).<br />
<strong>Yicheng Xu, Yuxin Chen, Jiahao Nie, Yusong Wang, Huiping Zhuang, Manabu Okumura</strong><br />
Advancing Cross-domain Discriminability in Continual Learning of Vision-Language Models</li>
<li><strong>2024/9/20</strong> The following paper was accepted to Findings of the 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024).<br />
<strong>Dongyuan Li, Ying Zhang, Zhen Wang, Shiyin Tan, Satoshi Kosugi, Manabu Okumura</strong><br />
Active Learning for Abstractive Text Summarization via LLM-Determined Curriculum and Certainty Gain Maximization</li>
<li><strong>2024/9/20</strong> The following paper was accepted to Findings of the 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024).<br />
<strong>Tsutomu Hirao, Naoki Kobayashi, Hidetaka Kamigaito, Manabu Okumura, Akisato Kimura</strong><br />
Video Discourse Parsing and Its Application to Multimodal Summarization: A Dataset and Baseline Approaches</li>
<li><strong>2024/9/20</strong> The following paper was accepted to Findings of the 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024).<br />
<strong>Yusong Wang, Dongyuan Li, Jialun Shen, Yicheng Xu, Mingkun Xu, Kotaro Funakoshi, Manabu Okumura</strong><br />
LAMBDA: Large Language Model-Based Data Augmentation for Multi-Modal Machine Translation</li>
<li><strong>2024/9/17</strong> The following research project was selected for ACT-X (JST Strategic Basic Research Programs).<br />
<strong>小杉哲</strong><br />
視覚的説明が可能な対話的画像生成手法</li>
<li><strong>2024/8/27</strong> The following paper was accepted to PRICAI 2024.<br />
<strong>Yusong Wang, Ying Zhang, Dongyuan Li, Jialun Shen, Yicheng Xu, Mingkun Xu, Kotaro Funakoshi and Manabu Okumura</strong><br />
FINE-LMT: Fine-grained Feature Learning for Multi-Modal Machine Translation</li>
<li><strong>2024/8/1</strong> The following paper was accepted to the 12th International Conference on Human-Agent Interaction (HAI'24).<br />
<strong>Takao Obi, Kotaro Funakoshi</strong><br />
Can Respiration Make Spoken Interactions Better?</li>
<li><strong>2024/7/21</strong> The following paper was accepted to the demo track of the 25th Meeting of the Special Interest Group on Discourse and Dialogue (SIGDIAL 2024).<br />
<strong>Takao Obi, Kotaro Funakoshi</strong><br />
Using Respiration for Enhancing Human-Robot Dialogue</li>
<li><strong>2024/5/16</strong> The following paper was accepted to Findings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024).<br />
<strong>Congda Ma, Tianyu Zhao, Manabu Okumura</strong><br />
Debiasing Large Language Models with Structured Knowledge</li>
<li><strong>2024/5/16</strong> The following paper was accepted to Findings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024).<br />
<strong>Juseon-Do, Hidetaka Kamigaito, Manabu Okumura, Jingun Kwon</strong><br />
InstructCMP: Length Control in Sentence Compression through Instruction-based Large Language Models</li>
<li><strong>2024/5/2</strong> The following paper was accepted to the Forty-first International Conference on Machine Learning (ICML 2024).<br />
<strong>Shiyin Tan, Dongyuan Li, Renhe Jiang, Ying Zhang, Manabu Okumura</strong><br />
Community-Invariant Graph Contrastive Learning</li>
<li><strong>2024/3/26</strong> The following paper was accepted to the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2024).<br />
<strong>Ye Xiong, Hidetaka Kamigaito, Soichiro Murakami, Peinan Zhang, Hiroya Takamura and Manabu Okumura</strong><br />
Grasping Both Query Relevance and Essential Content for Query-focused Summarization</li>
<li><strong>2024/3/16</strong> The following paper was accepted to the 2024 IEEE International Symposium on Olfaction and Electronic Nose (ISOEN).<br />
<strong>Toshiki Kawamoto, Masaki Tashiro, Takamichi Nakamoto, Manabu Okumura</strong><br />
Creating Linguistic Embedding Space for Odors</li>
<li><strong>2024/3/14</strong> The following presentation received a Young Researcher Encouragement Award at the 30th Annual Meeting of the Association for Natural Language Processing (NLP2024).<br />
<strong>前川在, 小杉哲, 船越孝太郎, 奥村学</strong><br />
テキスト生成モデルを利用したデータセット蒸留</li>
<li><strong>2024/3/14</strong> The following presentation received a Committee Special Award at the 30th Annual Meeting of the Association for Natural Language Processing (NLP2024).<br />
<strong>森雄一郎, 前川在, 小杉哲, 船越孝太郎, 高村大也, 奥村学</strong><br />
サッカー実況中継を付加的情報の提供という側面から見る</li>
<li><strong>2024/3/14</strong> The following paper was accepted to Findings of the 2024 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2024).<br />
<strong>Aru Maekawa, Satoshi Kosugi, Kotaro Funakoshi, Manabu Okumura</strong><br />
DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation</li>
<li><strong>2024/1/18</strong> The following paper was accepted to the 18th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2024).<br />
<strong>Boonnithi Jiaramaneepinit, Thodsaporn Chay-intr, Kotaro Funakoshi, Manabu Okumura</strong><br />
Extreme Fine-tuning: A Novel and Fast Fine-tuning Approach</li>
<li><strong>2024/1/18</strong> The following paper was accepted to the 18th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2024).<br />
<strong>Aru Maekawa, Tsutomu Hirao, Hidetaka Kamigaito, Manabu Okumura</strong><br />
Can we obtain significant success in RST discourse parsing by using Large Language Models?</li>
<li><strong>2024/1/11</strong> The following paper was accepted to the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI'24).<br />
<strong>Takao Obi, Kotaro Funakoshi</strong><br />
Respiration-enhanced Human-Robot Communication</li>
<li><strong>2023/12/14</strong> The following presentation received a Young Researcher Excellence Award at the 14th Dialogue System Symposium (the 99th meeting of the JSAI SIG on Spoken Language Understanding and Dialogue Processing).<br />
<strong>小尾 賢生, 船越 孝太郎</strong><br />
親和的な対話システムの実現にむけた非接触呼吸推定技術の開発</li>
<li><strong>2023/12/14</strong> The following paper was accepted to the 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (IEEE ICASSP 2024).<br />
<strong>Zhen Wang, Dongyuan Li, Manabu Okumura</strong><br />
Multimodal Graph-based Audio-Visual Event Localization</li>
<li><strong>2023/11/17</strong> The following paper was accepted to the 30th International Conference on MultiMedia Modeling (MMM 2024).<br />
<strong>Zhen Wang, Peide Zhu, Fuyang Yu and Manabu Okumura</strong><br />
A New Benchmark and OCR-free Method for Document Image Topic Classification</li>
<li><strong>2023/11/17</strong> The following paper was accepted to the 30th International Conference on MultiMedia Modeling (MMM 2024).<br />
<strong>Peide Zhu, Zhen Wang, Manabu Okumura and Jie Yang</strong><br />
MRHF: Multi-stage Retrieval and Hierarchical Fusion for Textbook Question Answering</li>
<li><strong>2023/11/17</strong> The following paper was accepted to the 30th International Conference on MultiMedia Modeling (MMM 2024).<br />
<strong>Fuyang Yu, Zhen Wang, Dongyuan Li, Peide Zhu, Xiaochuan Wang, Xiaohui Liang and Manabu Okumura</strong><br />
Towards Cross-modal Point Cloud Retrieval for Indoor Scenes</li>
<li><strong>2023/10/8</strong> The following paper was accepted to the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023).<br />
<strong>Dongyuan Li, Yusong Wang, Kotaro Funakoshi, Manabu Okumura</strong><br />
Joyful: Joint Modality Fusion and Graph Contrastive Learning for Multimodal Emotion Recognition</li>
<li><strong>2023/10/3</strong> The following paper was accepted to iSAI-NLP 2023.<br />
<strong>Wuttinan Longjaroen, Thodsaporn Chay-intr, Kotaro Funakoshi, Ananlada Chotimongkol and Sasiporn Usanavasin</strong><br />
Simple2In1: A Simple Method for Fusing Two Sequences from Different Captioning Systems into One Sequence for a Small-scale Thai Dataset</li>
<li><strong>2023/9/22</strong> The following paper was accepted to the IEEE Workshop on Automatic Speech Recognition and Understanding 2023.<br />
<strong>Dongyuan Li, Yusong Wang, Kotaro Funakoshi, Manabu Okumura</strong><br />
Active Learning Based Fine-Tuning Framework for Speech Emotion Recognition</li>
<li><strong>2023/9/8</strong> The following paper was accepted to PACLIC 2023.<br />
<strong>Toshiki Kawamoto, Yuki Okano, Takato Yamazaki, Toshinori Sato, Kotaro Funakoshi and Manabu Okumura</strong><br />
A Follow-up Study on Evaluation Metrics Using Follow-up Utterances</li>
<li><strong>2023/9/1</strong> <a href="https://aclanthology.org/people/j/jingun-kwon/">Jingun Kwon</a>, an alumnus of our laboratory, was appointed Assistant Professor at <a href="https://plus.cnu.ac.kr/html/en/">Chungnam National University</a>.</li>
<li><strong>2023/8/5</strong> The following paper was accepted to CIKM 2023.<br />
<strong>Dongyuan Li, Shiyin Tan, Yusong Wang, Kotaro Funakoshi and Manabu Okumura</strong><br />
Temporal and Topological Augmentation-based Cross-view Contrastive Learning Model for Temporal Link Prediction</li>
<li><strong>2023/7/21</strong> The following paper was accepted to ICMI 2023.<br />
<strong>Takao Obi, Kotaro Funakoshi</strong><br />
Video-based Respiratory Waveform Estimation in Dialogue: A Novel Task and Dataset for Human-Machine Interaction</li>
<li><strong>2023/7/3</strong> The following paper was accepted to RANLP 2023.<br />
<strong>Congda Ma, Kotaro Funakoshi, Kiyoaki Shirai and Manabu Okumura</strong><br />
Coherent Story Generation with Structured Knowledge</li>
<li><strong>2023/5/24</strong> The following paper was accepted to BEA 2023.<br />
<strong>Yuki Okano, Kotaro Funakoshi, Ryo Nagata, Manabu Okumura</strong><br />
Generating Dialog Responses with Specified Grammatical Items for Second Language Learning</li>
<li><strong>2023/5/3</strong> The following paper was accepted to Findings of ACL 2023.<br />
<strong>Ying Zhang, Hidetaka Kamigaito and Manabu Okumura</strong><br />
Bidirectional Transformer Reranker for Grammatical Error Correction</li>
<li><strong>2023/5/3</strong> The following paper was accepted to Findings of ACL 2023.<br />
<strong>Jian Wu, Yicheng Xu, Yan Gao, Jian-Guang Lou, Börje F. Karlsson and Manabu Okumura</strong><br />
TACR: A Table Alignment-based Cell Selection Method for HybridQA</li>
<li><strong>2023/5/3</strong> The following paper was accepted to ACL 2023.<br />
<strong>Aru Maekawa, Naoki Kobayashi, Kotaro Funakoshi and Manabu Okumura</strong><br />
Dataset Distillation with Attention Labels for Fine-tuning BERT</li>
<li><strong>2023/5/3</strong> The following paper was accepted to ACL 2023.<br />
<strong>Congda Ma, Tianyu Zhao, Makoto Shing, Kei Sawada and Manabu Okumura</strong><br />
Focused Prefix Tuning for Controllable Text Generation</li>
<li><strong>2023/4/14</strong> The following paper was accepted to ICMR 2023.<br />
<strong>Yusong Wang, Dongyuan Li, Kotaro Funakoshi and Manabu Okumura</strong><br />
EMP: Emotion-guided Multi-modal Fusion and Contrastive Learning for Personality Traits Recognition</li>
<li><strong>2023/3/16</strong> The following presentation received a Committee Special Award at the 29th Annual Meeting of the Association for Natural Language Processing.<br />
<strong>前川在, 小林尚輝, 船越孝太郎, 奥村学</strong><br />
事前学習済みTransformerモデルのための注意教師付きFew-shotデータの蒸留</li>
<li><strong>2023/1/22</strong> The following paper was accepted to EACL 2023.<br />
<strong>Aru Maekawa, Hidetaka Kamigaito, Kotaro Funakoshi and Manabu Okumura</strong><br />
Generative Replay inspired by Hippocampal Memory Indexing for Continual Language Learning</li>
<li><strong>2023/1/22</strong> The following paper was accepted to Findings of EACL 2023.<br />
<strong>Jingun Kwon, Hidetaka Kamigaito, Young-In Song and Manabu Okumura</strong><br />
Hierarchical Label Generation for Text Classification</li>
<li><strong>2023/1/22</strong> The following paper was accepted to Findings of EACL 2023.<br />
<strong>Jingun Kwon, Hidetaka Kamigaito and Manabu Okumura</strong><br />
Abstractive Document Summarization with Summary-length Prediction</li>
<li><strong>2022/10/6</strong> The following paper was accepted to Findings of EMNLP 2022.<br />
<strong>Naoki Kobayashi, Tsutomu Hirao, Hidetaka Kamigaito, Manabu Okumura and Masaaki Nagata</strong><br />
A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing</li>
<li><strong>2022/9/22</strong> The following paper was accepted to Findings of AACL-IJCNLP 2022.<br />
<strong>Yukun Feng, Amir Fayazi, Abhinav Rastogi and Manabu Okumura</strong><br />
Efficient Entity Embedding Construction from Type Knowledge for BERT</li>
<li><strong>2022/9/9</strong> The following paper was accepted to the Transactions of the Japanese Society for Artificial Intelligence.<br />
<strong>船越孝太郎</strong><br />
非公理的項論理：認知的記号推論の計算理論 (to appear in Vol. 37, No. 6)</li>
<li><strong>2022/8/29</strong> A research theme proposed by the team of Associate Professor Funakoshi and colleagues was selected for <a href="https://www.titech.ac.jp/news/2022/064672">"DLab Challenge 2022: Research Support for Realizing the Future"</a> of Tokyo Tech's Future Society DESIGN organization (DLab).</li>
<li><strong>2022/8/17</strong> The following papers were accepted to <a href="https://coling2022.org/">COLING 2022</a>.<br />
<strong>Jingyi You, Dongyuan Li, Manabu Okumura and Kenji Suzuki</strong><br />
JPG - Jointly Learn to Align: Automated Disease Prediction and Radiology Report Generation<br />
<strong>Dongyuan Li, Jingyi You, Kotaro Funakoshi and Manabu Okumura</strong><br />
A-TIP: Attribute-aware Text Infilling via Pre-trained Language Model<br />
<strong>Yidong Wang, Hao Wu, Ao Liu, Wenxin Hou, Zhen Wu, Jindong Wang, Takahiro Shinozaki, Manabu Okumura and Yue Zhang</strong><br />
Exploiting Unlabeled Data for Target-Oriented Opinion Words Extraction</li>
<li><strong>2022/4/8</strong> The following paper was accepted to the <a href="https://2022.naacl.org/">NAACL-HLT 2022 Industry Track</a>.<br />
<strong>Soichiro Murakami, Peinan Zhang, Sho Hoshino, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura</strong><br />
Aspect-based Analysis of Advertising Appeals for Search Engine Advertising</li>
<li><strong>2022/4/8</strong> The following papers were accepted to <a href="https://2022.naacl.org/">NAACL 2022</a>.<br />
<strong>Jingyi You, Dongyuan Li, Hidetaka Kamigaito, Kotaro Funakoshi, Manabu Okumura</strong><br />
Joint Learning-based Heterogeneous Graph Attention Network for Timeline Summarization<br />
<strong>Toshiki Kawamoto, Hidetaka Kamigaito, Kotaro Funakoshi, Manabu Okumura</strong><br />
Generating Repetitions with Appropriate Repeated Words</li>
<li><strong>2022/3/17</strong> The following work received the LegalForce Award at the 28th Annual Meeting of the Association for Natural Language Processing.<br />
<strong>田代真生, 上垣外英剛, 船越孝太郎, 奥村学</strong><br />
否定の理解へのprompt-based finetuningの効果</li>
<li><strong>2022/3/15</strong> The following papers received the Association for Natural Language Processing 2021 Paper Award.<br />
<strong>Hidetaka Kamigaito, Katsuhiko Hayashi, Tsutomu Hirao, Masaaki Nagata and Manabu Okumura</strong><br />
Effectiveness of Syntactic Dependency Information for Higher-Order Syntactic Attention Network<br />
<strong>村上 聡一朗, 田中 天, 萩行 正嗣, 上垣外 英剛, 船越 孝太郎, 高村 大也, 奥村 学</strong><br />
数値気象予報からの天気予報コメントの自動生成</li>
<li><strong>2022/3/5</strong> The following presentation received a Student Encouragement Award at the <a href="https://www.ipsj.or.jp/event/taikai/84/index.html">84th IPSJ National Convention</a>.<br />
<strong>小尾 賢生, 船越 孝太郎</strong><br />
CNN-LSTMを用いた非接触型呼吸推定手法の開発</li>
<li><strong>2021/12/15</strong> The following papers were accepted to the <a href="https://www.anlp.jp/abst/vol28/no4.html">Journal of Natural Language Processing</a>.<br />
<strong>Lya Hulliyyatus Suadaa, Hidetaka Kamigaito, Manabu Okumura and Hiroya Takamura</strong><br />
Metric-Type Identification for Multilevel Header Numerical Tables in Scientific Papers<br />
<strong>Chenlong Hu, Yukun Feng, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura</strong><br />
One-class Text Classification with Multi-modal Deep Support Vector Data Description<br />
<strong>Abdurrisyad Fikri, Hiroya Takamura and Manabu Okumura</strong><br />
Stylistically User-specific Response Generation<br />
<strong>村上 聡一朗, 田中 天, 萩行 正嗣, 上垣外 英剛, 船越 孝太郎, 高村 大也, 奥村 学</strong><br />
数値気象予報からの天気予報コメントの自動生成</li>
<li><strong>2021/10/4</strong> The following paper received a Young Researcher Encouragement Award at the <a href="https://www.ipsj.or.jp/kenkyukai/event/nl250.html">250th meeting of the IPSJ SIG on Natural Language Processing (SIG-NL)</a>.<br />
<strong>藤田 正悟, 上垣外 英剛, 船越 孝太郎, 奥村 学</strong><br />
抽出型複数文書要約における文順序を考慮した評価</li>
<li><strong>2021/9/1</strong> <a href="https://researchmap.jp/hikaruy">横野光</a>, an alumnus of our laboratory, was appointed associate professor in the <a href="http://www.is.meisei-u.ac.jp/">Department of Information Science, School of Information Science, Meisei University</a>.</li>
<li><strong>2021/8/26</strong> The following papers were accepted to <a href="https://2021.emnlp.org/">EMNLP 2021</a>.<br />
<strong>Jingun Kwon, Naoki Kobayashi, Hidetaka Kamigaito and Manabu Okumura</strong><br />
Considering Nested Tree Structure in Sentence Extractive Summarization with Pre-trained Transformer<br />
<strong>Ying Zhang, Hidetaka Kamigaito and Manabu Okumura</strong><br />
A Language Model-based Generative Classifier for Sentence-level Discourse Parsing</li>
<li><strong>2021/8/8</strong> The following paper was accepted to <a href="https://www.cikm2021.org/">CIKM 2021</a>.<br />
<strong>Jingyi You, Chenlong Hu, Hidetaka Kamigaito, Kotaro Funakoshi and Manabu Okumura</strong><br />
Robust Dynamic Clustering for Temporal Networks</li>
<li><strong>2021/7/28</strong> The following papers were accepted to <a href="https://ranlp.org/ranlp2021/start.php">RANLP 2021</a>.<br />
<strong>Jingun Kwon, Naoki Kobayashi, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura</strong><br />
Making Your Tweets More Fancy: Emoji Insertion to Texts<br />
<strong>Yukun Feng, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura</strong><br />
Improving Character-Aware Neural Language Model by Warming Up Character Encoder under Skip-gram Architecture<br />
<strong>Thodsaporn Chay-intr, Hidetaka Kamigaito and Manabu Okumura</strong><br />
Character-based Thai Word Segmentation with Multiple Attentions<br />
<strong>Ying Zhang, Hidetaka Kamigaito, Tatsuya Aoki, Hiroya Takamura and Manabu Okumura</strong><br />
Generic Mechanism for Reducing Repetitions in Encoder-Decoder Models<br />
<strong>Jingyi You, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura</strong><br />
Abstractive Document Summarization with Word Embedding Reconstruction</li>
<li><strong>2021/5/6</strong> The following paper was accepted to Findings of ACL 2021.<br />
<strong>Yijin Xiong, Yukun Feng, Hao Wu, Hidetaka Kamigaito and Manabu Okumura</strong><br />
Fusing Label Embedding into BERT: An Efficient Improvement for Text Classification</li>
<li><strong>2021/5/6</strong> The following papers were accepted to <a href="https://2021.aclweb.org/">ACL 2021</a>.<br />
<strong>Lya Hulliyyatus Suadaa, Hidetaka Kamigaito, Kotaro Funakoshi, Manabu Okumura and Hiroya Takamura</strong><br />
Towards Table-to-Text Generation with Numerical Reasoning<br />
<strong>Hidetaka Kamigaito and Katsuhiko Hayashi</strong><br />
Unified Interpretation of Softmax Cross-Entropy and Negative Sampling for Knowledge Graph Embedding</li>
<li><strong>2021/3/18</strong> The following work received an Excellent Paper Award at the 27th Annual Meeting of the Association for Natural Language Processing.<br />
<strong>竹下智章, 上垣外英剛, 船越孝太郎, 高村大也, 奥村学</strong><br />
関連タスクの予測確率分布を用いるsoft-gated BERTによる対話印象分類</li>
<li><strong>2021/3/18</strong> 小林尚輝 received a Young Researcher Encouragement Award at the 27th Annual Meeting of the Association for Natural Language Processing for the following work.<br />
<strong>小林尚輝, 平尾努, 上垣外英剛, 奥村学, 永田昌明</strong><br />
疑似正解データを利用した修辞構造解析器の改善</li>
\u4ee5\u4e0b\u306e\u7814\u7a76\u304c<strong>\u2f94\u8a9e\u51e6\u7406\u5b66\u4f1a\u7b2c27\u56de\u5e74\u6b21\u2f24\u4f1a\u59d4\u54e1\u7279\u5225\u8cde<\/strong>\u3092\u53d7\u8cde\u3044\u305f\u3057\u307e\u3057\u305f.<br \/>\n<strong>\u4e0a\u57a3\u5916\u82f1\u525b,\u6797\u514b\u5f66<\/strong><br \/>\n\u77e5\u8b58\u30b0\u30e9\u30d5\u57cb\u3081\u8fbc\u307f\u5b66\u7fd2\u306b\u304a\u3051\u308b\u640d\u5931\u95a2\u6570\u306e\u7d71\u4e00\u7684\u89e3\u91c8<\/li>\n<li><strong>2021\/3\/18 NAACL2021 Industry Track <\/strong>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Hidetaka Kamigaito*, Peinan Zhang*, Hiroya Takamura, Manabu Okumura (*: equal contribution)<\/strong><br \/>\nAn Empirical Study of Generating Texts for Search Engine Advertising<\/li>\n<li><strong>2021\/3\/11 NAACL-HLT 2021 <\/strong>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Naoki Kobayashi, Tsutomu Hirao, Hidetaka Kamigaito, Manabu Okumura and Masaaki Nagata<\/strong><br \/>\nImproving Neural RST Parsing Model with Silver Agreement Subtrees<\/li>\n<p><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><\/p>\n<p><strong><\/strong><strong><\/strong><strong><\/strong><\/p>\n<p><strong><\/strong><\/ul>\n<p><strong><strong><strong><br \/>\n<\/strong><\/strong><\/strong><\/p>\n<p><strong><strong><br \/>\n<\/strong><strong><\/strong><\/strong><\/p>\n<p><strong><br \/>\n<\/strong><strong><\/strong><strong><\/strong><\/li>\n<p><strong><strong><strong><br \/>\n<\/strong><\/strong><\/strong><\/p>\n<p><strong><strong><br \/>\n<\/strong><strong><\/strong><\/strong><\/p>\n<p><strong><br 
trong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/strong><\/p>\n<p><strong><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/strong><\/p>\n<p><strong><\/p>\n<ul>\n<li style=\"list-style-type: none;\">\n<ul>\n<li><a href=\"http:\/\/lr-www.pi.titech.ac.jp\/wp_en\/\"><strong>2021\/2\/20 EACL2021 <\/strong><\/a><a href=\"https:\/\/2021.eacl.org\/calls\/demos\/\">System demonstration<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Hidetaka Kamigaito, Jingun Kwon, Young-In Song and Manabu Okumura<\/strong><br \/>\nA New Surprise Measure for Extracting Interesting Relationships between Persons<\/li>\n<li><strong>2021\/2\/13<\/strong> <a href=\"http:\/\/nlpprogress.com\/english\/semantic_parsing.html#rst-dt\">NLP Progress<\/a>\u306b\u4ee5\u4e0b\u306e\u672c\u7814\u7a76\u5ba4\u306e\u6210\u679c\u304c\u63b2\u8f09\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Naoki Kobayashi, Tsutomu Hirao, Hidetaka Kamigaito, Manabu Okumura and Masaaki Nagata<\/strong><br \/>\nTop-down RST Parsing Utilizing Granularity Levels in Documents (AAAI2020)<br \/>\n<a href=\"http:\/\/nlpprogress.com\/english\/summarization.html\">NLP Progress<\/a>\u3067\u306f\u305d\u306e\u4ed6\u306b\u3082\u672c\u7814\u7a76\u5ba4\u306e\u6210\u679c\u3067\u3042\u308b<br \/>\n<strong>Hidetaka Kamigaito and Manabu Okumura<\/strong><br \/>\nSyntactically Look-Ahead Attention Network for Sentence Compression (AAAI2020)<br \/>\n\u304c\u63b2\u8f09\u3055\u308c\u3066\u3044\u307e\u3059\uff0e<\/li>\n<li><strong>2021\/1\/12<\/strong> <a href=\"https:\/\/2021.eacl.org\/\">EACL2021<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u629e\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Soichiro Murakami, Sora Tanaka, Masatsugu Hangyo, Hidetaka Kamigaito, Kotaro Funakoshi, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nGenerating Weather Comments from Meteorological Simulations<br \/>\n<strong>Lya Hulliyyatus Suadaa, Hiroya Takamura, Hidetaka Kamigaito and Manabu Okumura<\/strong><br \/>\nMetric-Type Identification for Multi-Level Header Numerical Tables in Scientific Papers<br \/>\n<strong>Chenlong Hu, Yukun Feng, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nOne-class Text Classification with Multi-Modal 
Deep Support Vector Data Description<\/li>\n<li><strong>2020\/11\/26<\/strong> The following research received the <a href=\"https:\/\/www.ipsj.or.jp\/award\/yamasita2020-detail.html\">FY2020 IPSJ Yamashita SIG Research Award<\/a>.<br \/>\n<strong>上垣外英剛, 奥村学<\/strong><br \/>\n階層的な注意機構に基づき統語的な先読みを行う文圧縮<\/li>\n<li><strong>2020\/11\/26<\/strong> The following research received the <a href=\"https:\/\/www.ai-gakkai.or.jp\/about\/award\/jsai_award-conf\/#CONFERENCE\">FY2020 (34th) JSAI Annual Conference Award<\/a>.<br \/>\n<strong>山田悠右, 藤田綜一郎, 柴田知秀, 小林隼人, 田口拓明, 奥村学<\/strong><br \/>\n人手評価を考慮した強化学習に基づくニュース見出し生成<\/li>\n<li><strong>2020\/11\/16<\/strong> On 2020\/12\/3, from 15:05, Prof. Okumura will present our recent research results at a seminar of the Laboratory for Future Interdisciplinary Research of Science and Technology. If you are interested, please register <a href=\"https:\/\/www.iir.titech.ac.jp\/openlab\/future\/\">here<\/a>.<br \/>\nOn the same day, between 10:00 and 15:00, we also plan to open the lab to visitors over Zoom; please register for the open lab <a href=\"https:\/\/www.iir.titech.ac.jp\/openlab\/future\/#laboratory\">here<\/a>.<br \/>\nWe plan to hold two sessions of about one hour each, starting at 10:00 and at 13:00, though the number of sessions may change depending on registrations.<\/li>\n<li><strong>2020\/9\/30<\/strong> The following papers were accepted to <a href=\"https:\/\/coling2020.org\/\">COLING2020<\/a>.<br \/>\n<strong>Jingun Kwon, Hidetaka Kamigaito, Young-In Song and Manabu Okumura<\/strong><br \/>\nHierarchical Trivia Fact Extraction from Wikipedia Articles<br \/>\n<strong>Shogo Fujita, Tomohide Shibata and Manabu Okumura<\/strong><br \/>\nDiverse and Non-redundant Answer Set Extraction on Community QA based on DPPs<br \/>\n<strong>Shogo Fujita, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nPointing to Subwords for Generating Function Names in Source Code<br \/>\n<strong>Riku Kawamura, Tatsuya Aoki, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nNeural text normalization leveraging similarities of strings and sounds<\/li>\n<li><strong>2020\/9\/30<\/strong> The following paper was accepted to <a
href=\"http:\/\/aacl2020.org\/calls\/papers\/\">AACL-IJCNLP2020<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Yukun Feng, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nA Simple and Effective Usage of Word Clusters for CBOW Model<\/li>\n<li><strong>2020\/9\/15<\/strong> <a href=\"https:\/\/anlp.jp\/guide\/saitaku.html\">\u8a00\u8a9e\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>\u77f3\u57a3 \u9054\u4e5f,\u753a\u7530 \u548c\u54c9,\u5c0f\u6797 \u96bc\u4eba,\u9ad8\u6751 \u5927\u4e5f,\u5965\u6751 \u5b66<\/strong><br \/>\n\u8cea\u554f\u2015\u56de\u7b54\u30da\u30a2\u3092\u6d3b\u7528\u3059\u308b\u534a\u6559\u5e2b\u3042\u308a\u62bd\u51fa\u578b\u8cea\u554f\u8981\u7d04\u30e2\u30c7\u30eb\u3068\u305d\u306e\u5b66\u7fd2\u6cd5<\/li>\n<li><strong>2020\/7\/3<\/strong> <a href=\"https:\/\/eccv2020.eu\/\">ECCV2020<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>Soichiro Fujita, Tsutomu Hirao, Hidetaka Kamigaito, Manabu Okumura, Masaaki Nagata<\/strong><br \/>\nSODA: Story Oriented Dense Video Captioning Evaluation Framework<\/li>\n<li><strong>2020\/6\/19<\/strong> <a href=\"https:\/\/www.anlp.jp\/abst\/vol27\/no2.html\">\u8a00\u8a9e\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f.<br \/>\n<strong>\u6751\u4e0a \u8061\u4e00\u6717,\u6e21\u9089 \u4eae\u5f66,\u5bae\u6fa4 \u5f6c,\u4e94\u5cf6 \u572d\u4e00,\u67f3\u702c \u5229\u5f66,\u9ad8\u6751 \u5927\u4e5f,\u5bae\u5c3e \u7950\u4ecb<\/strong><br \/>\n\u6642\u7cfb\u5217\u682a\u4fa1\u30c7\u30fc\u30bf\u304b\u3089\u306e\u5e02\u6cc1\u30b3\u30e1\u30f3\u30c8\u306e\u81ea\u52d5\u751f\u6210<br \/>\n<strong>\u77f3\u5ddd \u958b,\u9ad8\u6751 \u5927\u4e5f,\u5965\u6751 \u5b66<\/strong><br \/>\n\u591a\u69d8\u306a\u30a4\u30d9\u30f3\u30c8\u8868\u73fe\u3092\u5bfe\u8c61\u3068\u3057\u305f\u77e5\u8b58\u62bd\u51fa\u306b\u304a\u3051\u308b\u7570\u306a\u308b\u30a8\u30f3\u30b3\u30fc\u30c7\u30a3\u30f3\u30b0\u30e2\u30c7\u30eb\u7fa4\u306e\u52d5\u7684\u30a2\u30f3\u30b5\u30f3\u30d6\u30eb<\/li>\n<li><strong>2020\/4\/13<\/strong>&nbsp;\u672c\u7814\u7a76\u5ba4OB\u3067\u3042\u308b<a href=\"https:\/\/sites.google.com\/site\/tamuakihp\/\">\u7530\u6751\u6643\u88d5<\/a>\u3055\u3093\u304c<a href=\"https:\/\/se.doshisha.ac.jp\/education\/course\/information\/overview.html\">\u540c\u5fd7\u793e\u5927\u5b66\u7406\u5de5\u5b66\u90e8\u60c5\u5831\u30b7\u30b9\u30c6\u30e0\u30c7\u30b6\u30a4\u30f3\u5b66\u79d1<\/a>\u51c6\u6559\u6388\u306b\u7740\u4efb\u3055\u308c\u307e\u3057\u305f\uff0e<\/li>\n<li><strong>2020\/4\/13<\/strong>&nbsp;\u672c\u7814\u7a76\u5ba4OB\u3067\u3042\u308b<a href=\"https:\/\/www.tuins.ac.jp\/society\/teacher_shinmori_akihiro.html\">\u65b0\u68ee\u662d\u5b8f<\/a>\u3055\u3093\u304c<a href=\"https:\/\/www.tuins.ac.jp\/society\/\">\u5bcc\u5c71\u56fd\u969b\u5927\u5b66\u73fe\u4ee3\u793e\u4f1a\u5b66\u90e8<\/a>\u6559\u6388\u306b\u7740\u4efb\u3055\u308c\u307e\u3057\u305f\uff0e<\/li>\n<li><strong>2020\/4\/13<\/strong>&nbsp;\u672c\u7814\u7a76\u5ba4\u306b<a href=\"https:\/\/researchmap.jp\/funakoshi.k\">\u8239\u8d8a\u5b5d\u592a\u90ce<\/a>\u51c6\u6559\u6388\u304c\u7740\u4efb\u3055\u308c\u307e\u3057\u305f\uff0e<\/li>\n<li><strong>2019\/12\/10<\/strong>&nbsp;<a 
href=\"https:\/\/ecir2020.org\">ECIR2020<\/a>\u306b\u4ee5\u4e0b\u306e4\u8ad6\u6587\u304c\u63a1\u629e\u3055\u308c\u307e\u3057\u305f\uff0e<br \/>\n<strong>Tatsuya Ishigaki, Hen-Hsen Huang, Hiroya Takamura, Hsin-Hsi Chen and Manabu Okumura<\/strong><br \/>\nNeural Query-biased Abstractive Summarization Using Copying Mechanism<br \/>\n<strong>Tatsuya Ishigaki, Kazuya Machida, Hayato Kobayashi, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nDistant Supervision for Extractive Question Summarization<br \/>\n<strong>Kazuya Machida, Tatsuya Ishigaki, Hayato Kobayashi, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nSemi-Supervised Extractive Question Summarization Using Question-Answer Pairs<br \/>\n<strong>Soichiro Fujita, Hayato Kobayashi and Manabu Okumura<\/strong><br \/>\nUnsupervised Ensemble of Ranking Models for News Comments Using Pseudo Answers<\/li>\n<li><strong>2019\/12\/6<\/strong>&nbsp;\u60c5\u5831\u51e6\u7406\u5b66\u4f1a<a href=\"https:\/\/nl-ipsj.or.jp\/2019\/11\/12\/nl243-program\/\">\u7b2c243\u56de\u81ea\u7136\u8a00\u8a9e\u51e6\u7406\u7814\u7a76\u4f1a(NL\u7814)<\/a>\u306b\u304a\u3044\u3066\uff0c\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c<a href=\"https:\/\/nl-ipsj.or.jp\/award\/\">\u512a\u79c0\u7814\u7a76\u8cde<\/a>\u3092\u53d7\u8cde\u3057\u307e\u3057\u305f\uff0e<strong><br \/>\n\u4e0a\u57a3\u5916\u82f1\u525b, \u5965\u6751\u5b66<\/strong><br \/>\n\u968e\u5c64\u7684\u306a\u6ce8\u610f\u6a5f\u69cb\u306b\u57fa\u3065\u304d\u7d71\u8a9e\u7684\u306a\u5148\u8aad\u307f\u3092\u884c\u3046\u6587\u5727\u7e2e<\/li>\n<li><strong>2019\/11\/11<\/strong>\u3000<a href=\"https:\/\/aaai.org\/Conferences\/AAAI-20\/\">AAAI 2020<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u629e\u3055\u308c\u307e\u3057\u305f\uff0e<br \/>\n<strong>Naoki Kobayashi, Tsutomu Hirao, Hidetaka Kamigaito, Manabu Okumura and Masaaki Nagata<\/strong><br \/>\nTop-down RST Parsing Utilizing Granularity Levels in Documents<br \/>\n<strong>Hidetaka Kamigaito and Manabu Okumura<\/strong><br \/>\nSyntactically Look-Ahead Attention Network for Sentence Compression<\/li>\n<li><strong>2019\/9\/20<\/strong>\u3000<a href=\"https:\/\/www.idiap.ch\/workshop\/DiscoMT\">DiscoMT 2019<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u629e\u3055\u308c\u307e\u3057\u305f\uff0e<br \/>\n<strong>Takumi Ohtani, Hidetaka Kamigaito, Masaaki Nagata, Mababu Okumura<\/strong><br \/>\nContext-aware Neural Machine Translation with Coreference Information<\/li>\n<li><strong>2019\/9\/18<\/strong>&nbsp;\u60c5\u5831\u51e6\u7406\u5b66\u4f1a<a href=\"https:\/\/nl-ipsj.or.jp\/2019\/08\/01\/nl241-program\/\">\u7b2c241\u56de\u81ea\u7136\u8a00\u8a9e\u51e6\u7406\u7814\u7a76\u4f1a(NL\u7814)<\/a>\u306b\u304a\u3044\u3066\uff0c\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c<a href=\"https:\/\/nl-ipsj.or.jp\/young-award\/\">\u82e5\u624b\u5968\u52b1\u8cde<\/a>\u3092\u53d7\u8cde\u3057\u307e\u3057\u305f\uff0e<strong><br \/>\n\u9577\u8c37\u5ddd\u99ff, \u4e0a\u57a3\u5916\u82f1\u525b, \u5965\u6751\u5b66<\/strong><br \/>\n\u751f\u6210\u578b\u6587\u8981\u7d04\u306e\u305f\u3081\u306e\u62bd\u51fa\u6027\u306b\u7740\u76ee\u3057\u305f\u30c7\u30fc\u30bf\u9078\u629e<\/li>\n<li><strong>2019\/9\/5<\/strong>\u3000<a href=\"https:\/\/www.inlg2019.com\">INLG2019<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u629e\u3055\u308c\u307e\u3057\u305f\uff0e<br \/>\n<strong>Yuta Hitomi, Yuya Taguchi, Hideaki Tamori, Ko Kikuta, Jiro Nishitoba, Naoaki Okazaki, Kentaro Inui and Manabu Okumura<\/strong><br \/>\nA Large-Scale Multi-Length Headline Corpus for Analyzing Length-Constrained Headline 
Generation Model Evaluation<\/li>\n<li><strong>2019\/8\/29<\/strong> The following paper was accepted to <a href=\"https:\/\/www.conll.org\/2019\">CoNLL2019<\/a>.<br \/>\n<strong>Yukun Feng, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nA Simple and Effective Method for Injecting Word-level Information into Character-aware Neural Language Models<\/li>\n<li><strong>2019\/8\/14<\/strong> The following paper was accepted to <a href=\"https:\/\/www.emnlp-ijcnlp2019.org\">EMNLP-IJCNLP2019<\/a>.<br \/>\n<strong>Naoki Kobayashi, Tsutomu Hirao, Kengo Nakamura, Hidetaka Kamigaito, Manabu Okumura and Masaaki Nagata<\/strong><br \/>\nSplit or Merge: Which is Better for Unsupervised RST Parsing?<\/li>\n<li><strong>2019\/7\/16<\/strong> The following papers were accepted to <a href=\"https:\/\/webintelligence2019.com\">Web Intelligence 2019<\/a>.<br \/>\n<strong>Jingun Kwon, Naoki Kobayashi, Hidetaka Kamigaito, Hiroya Takamura, and Manabu Okumura<\/strong><br \/>\nBridging between emojis and kaomojis by learning their representations from linguistic and visual information<br \/>\n<strong>Chahine Koleejan, Hiroya Takamura, and Manabu Okumura<\/strong><br \/>\nGenerating Objective Summaries of Sports Matches Using Social Media<\/li>\n<li><strong>2019\/7\/7<\/strong> The following paper was accepted to <a href=\"http:\/\/lml.bas.bg\/ranlp2019\/start.php\">RANLP2019<\/a>.<br \/>\n<strong>Tatsuya Ishigaki, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nDiscourse-aware Hierarchical Attention Network for Extractive Single-Document Summarization<\/li>\n<li><strong>2019\/6\/17<\/strong> The following paper received the <a href=\"https:\/\/nl-ipsj.or.jp\/award\/\">Outstanding Research Award<\/a> at the <a href=\"https:\/\/nl-ipsj.or.jp\/2019\/05\/24\/nl240-program\/\">240th meeting of the IPSJ SIG on Natural Language Processing (SIG-NL)<\/a>.<br \/>\n<strong>石垣達也, 黃瀚萱 (National Taiwan University), 陳信希 (National Taiwan University), 高村大也, 奥村学<\/strong><br \/>\nコピー機構を用いたクエリ指向ニューラル生成型要約<\/li>\n<li><strong>2019\/6\/04<\/strong> The following paper was accepted to <a href=\"https:\/\/www.pricai.org\/2019\/\">PRICAI2019<\/a>.<br \/>\n<strong>Chenlong Hu, Mikio Nakano, Manabu Okumura<\/strong><br \/>\nAutoEncoder Guided Bootstrapping of Semantic Lexicon<\/li>\n<li><strong>2019\/5\/14<\/strong> The following papers were accepted to <a href=\"http:\/\/acl2019.org\">ACL2019<\/a>.<br \/>\n<strong>Takuya Makino, Tomoya Iwakura, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nGlobal Optimization under Length Constraint for Neural Text Summarization<br \/>\n<strong>Soichiro Fujita, Hayato Kobayashi and Manabu Okumura<\/strong><br \/>\nDataset Creation for Ranking Constructive News Comments<\/li>\n<li><strong>2019\/3\/28<\/strong> The following paper was accepted to the <a
href=\"https:\/\/www.anlp.jp\/abst\/vol26\/no2.html\">\u8a00\u8a9e\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f\u3002<br \/>\n<strong>\u9752\u6728 \u7adc\u54c9\uff0c\u7b39\u91ce \u907c\u5e73\uff0c\u9ad8\u6751 \u5927\u4e5f\uff0c\u5965\u6751 \u5b66<\/strong><br \/>\n\u30bd\u30fc\u30b7\u30e3\u30eb\u30e1\u30c7\u30a3\u30a2\u306b\u304a\u3051\u308b\u5358\u8a9e\u306e\u4e00\u822c\u7684\u3067\u306f\u306a\u3044\u7528\u6cd5\u306e\u691c\u51fa<\/li>\n<li><strong>2019\/3\/15<\/strong> <a href=\"http:\/\/www.anlp.jp\/nlp2019\/program.html\">\u8a00\u8a9e\u51e6\u7406\u5b66\u4f1a\u7b2c25\u56de\u5e74\u6b21\u5927\u4f1a<\/a>\u306b\u304a\u3044\u3066\uff0c\u4ee5\u4e0b\u306e3\u540d\u304c\u82e5\u624b\u5968\u52b1\u8cde\u3092\u53d7\u8cde\u3057\u307e\u3057\u305f\uff0e<br \/>\n<strong>\u25cb\u753a\u7530\u548c\u54c9<\/strong><br \/>\n\u8cea\u554f-\u56de\u7b54\u5bfe\u3092\u5229\u7528\u3057\u305f\u534a\u6559\u5e2b\u6709\u308a\u62bd\u51fa\u578b\u8cea\u554f\u8981\u7d04<br \/>\n<strong>\u25cb\u77f3\u57a3\u9054\u4e5f<\/strong><br \/>\n\u8ac7\u8a71\u69cb\u9020\u3092\u8003\u616e\u3059\u308b\u968e\u5c64\u7684\u6ce8\u610f\u6a5f\u69cb\u306b\u3088\u308b\u62bd\u51fa\u578b\u30cb\u30e5\u30fc\u30e9\u30eb\u5358\u4e00\u6587\u66f8\u8981\u7d04<br \/>\n<strong>\u25cb\u4eba\u898b\u96c4\u592a<\/strong><br \/>\n\u51fa\u529b\u9577\u5236\u5fa1\u3092\u8003\u616e\u3057\u305f\u898b\u51fa\u3057\u751f\u6210\u30e2\u30c7\u30eb\u306e\u305f\u3081\u306e\u5927\u898f\u6a21\u30b3\u30fc\u30d1\u30b9<\/li>\n<li><strong>2018\/12\/16<\/strong> <a href=\"http:\/\/anlp.jp\/abst\/vol26\/no1.html\">\u8a00\u8a9e\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f\u3002<br \/>\n<strong>\u77f3\u57a3 \u9054\u4e5f\uff0c\u9ad8\u6751 \u5927\u4e5f\uff0c\u5965\u6751 \u5b66<\/strong><br \/>\n\u8907\u6570\u6587\u8cea\u554f\u3092\u5bfe\u8c61\u3068\u3057\u305f\u62bd\u51fa\u578b\u304a\u3088\u3073\u751f\u6210\u578b\u8981\u7d04<br \/>\n<strong>\u4e09\u6d66 \u5eb7\u79c0\uff0c\u72e9\u91ce \u7adc\u793a\uff0c\u8c37\u53e3 \u5143\u6a39\uff0c\u8c37\u53e3 \u53cb\u7d00\uff0c\u4e09\u6ca2 \u7fd4\u592a\u90ce\uff0c\u5927\u718a \u667a\u5b50<\/strong><br \/>\n\u6728\u69cb\u9020\u3068\u30b0\u30e9\u30d5\u69cb\u9020\u3092\u7528\u3044\u305f\u30aa\u30f3\u30e9\u30a4\u30f3\u8b70\u8ad6\u306b\u304a\u3051\u308b\u8ac7\u8a71\u884c\u70ba\u306e\u5206\u985e<\/li>\n<li><strong>2018\/10\/17<\/strong> <a href=\"http:\/\/id.nii.ac.jp\/1001\/00191512\/\">\u60c5\u5831\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c\u30c7\u30fc\u30bf\u30d9\u30fc\u30b9\uff08TOD\uff09<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u9332\u3055\u308c\u307e\u3057\u305f\u3002<br \/>\n<strong>\u99ac\u7de4 \u7f8e\u7a42, \u7b39\u91ce \u907c\u5e73, \u9ad8\u6751 \u5927\u4e5f, \u5965\u6751 \u5b66<\/strong><br \/>\n\u8077\u696d\u3054\u3068\u306e\u884c\u52d5\u306b\u95a2\u3059\u308b\u77e5\u8b58\u306e\u53ce\u96c6<\/li>\n<li><strong>2018\/11\/01<\/strong> <a href=\"https:\/\/aaai.org\/Conferences\/AAAI-19\/\">AAAI2019<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u629e\u3055\u308c\u307e\u3057\u305f\u3002<br \/>\n<strong>Yasufumi Taniguchi, Yukun Feng, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nGenerating Live Soccer-Match Commentary from Play Data<\/li>\n<li><strong>2018\/09\/07<\/strong> <a href=\"https:\/\/inlg2018.uvt.nl\">INLG2018<\/a>\u306b\u4ee5\u4e0b\u306e\u8ad6\u6587\u304c\u63a1\u629e\u3055\u308c\u307e\u3057\u305f\u3002<br \/>\n<strong>Abdurrisyad Fikri, Hiroya 
Takamura and Manabu Okumura<\/strong><br \/>\nStylistically User-Specific Generation<\/li>\n<li><strong>2018\/05\/07<\/strong> The following paper was accepted to <a href=\"http:\/\/coling2018.org\">COLING2018<\/a>.<br \/>\n<strong>Arata Ugawa, Akihiro Tamura, Takashi Ninomiya, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nNeural Machine Translation Incorporating Named Entity<\/li>\n<li><strong>2018\/04\/25<\/strong> Research by 谷口泰史, who completed his master's degree in March, was featured in the <a href=\"https:\/\/www.nikkei.com\/article\/DGXMZO29812440V20C18A4X90000\/\">Nihon Keizai Shimbun<\/a>.<\/li>\n<li><strong>2018\/03\/16<\/strong> The following paper received the FY2017 Best Paper Award of the Association for Natural Language Processing.<br \/>\n<strong>笹野遼平, 奥村学<\/strong><br \/>\n大規模コーパスに基づく日本語二重目的語構文の基本語順の分析<\/li>\n<li><strong>2017\/10\/01<\/strong> Associate Professor 高村大也 moved to the <a href=\"http:\/\/www.airc.aist.go.jp\/team_a\/\">AIST Artificial Intelligence Research Center<\/a> as a team leader, but will continue to supervise students at our university as a professor under the <a href=\"http:\/\/www.first.iir.titech.ac.jp\/news\/2017\/detail_328.html\">cross-appointment system<\/a>.<\/li>\n<li><strong>2017\/09\/01<\/strong> The following paper was accepted to <a href=\"http:\/\/ijcnlp2017.org\">IJCNLP2017<\/a>.<br \/>\n<strong>Tatsuya Ishigaki, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nSummarizing Lengthy Questions<\/li>\n<li><strong>2017\/07\/01<\/strong> The following paper was accepted to <a href=\"http:\/\/emnlp2017.net\">EMNLP2017<\/a>.<br \/>\n<strong>Tatsuya Aoki, Ryohei Sasano, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nDistinguishing Japanese Non-standard Usages from Standard Ones<\/li>\n<li><strong>2017\/06\/07<\/strong> The following paper was accepted to <a href=\"http:\/\/webintelligence2017.com\">WI2017<\/a>.<br \/>\n<strong>Soichiro Hirota, Ryohei Sasano, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nReal-Time Tweet Selection for TV News Programs<\/li>\n<li><strong>2017\/05\/31<\/strong> The following paper was accepted to <a href=\"http:\/\/pacling.ucsy.edu.mm\/pacling\/\">PACLING2017<\/a>.<br \/>\n<strong>Tatsuya Aoki, Katsumasa Yoshikawa, Tetsuya Nasukawa, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nDetecting Earthquake Survivors with Serious Mental Damage<\/li>\n<li><strong>2017\/03\/31<\/strong> The following papers were accepted to <a href=\"http:\/\/acl2017.org\/\">ACL2017<\/a>.<br \/>\n<strong>Soichiro Murakami, Akihiko Watanabe, Akira Miyazawa, Keiichi
Goshima, Toshihiko Yanase, Hiroya Takamura and Yusuke Miyao<\/strong><br \/>\nLearning to generate market comments from stock prices<br \/>\n<strong>Shun Hasegawa, Yuta Kikuchi, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nJapanese Sentence Compression with a Large Training Dataset<\/li>\n<li><strong>2017\/03\/16<\/strong> 村上聡一朗, a second-year master's student, received a Young Researcher Encouragement Award at the 23rd Annual Meeting of the Association for Natural Language Processing.<\/li>\n<li><strong>2017\/03\/16<\/strong> Associate Professor 高村大也 received an Outstanding Paper Award at the 23rd Annual Meeting of the Association for Natural Language Processing.<\/li>\n<li><strong>2017\/03\/10<\/strong> The following paper was accepted to <a href=\"https:\/\/web.archive.org\/web\/20170503224245\/http:\/\/www.cicling.org\/2017\/\">CICLING2017<\/a>.<br \/>\n<strong>Masanori Hayashi, Ryohei Sasano, Hiroya Takamura and Manabu Okumura<\/strong><br \/>\nJudging CEFR Levels of English Learner's Essays Based on Error-type Identification and Text Quality Measures<\/li>\n<li><strong>2017\/01\/12<\/strong> Research by 村上聡一朗, a second-year master's student, was featured in the <a href=\"https:\/\/web.archive.org\/web\/20170503224245\/http:\/\/globe.asahi.com\/feature\/article\/2016122700007.html?page=3\">Asahi Shimbun<\/a>.<\/li>\n<li><strong>2016\/12\/02<\/strong> The following paper was accepted to <a href=\"http:\/\/eacl2017.org\/\">EACL2017<\/a>.<br \/>\n<strong>Hiroya Takamura, Ryo Nagata and Yoshifumi Kawasaki<\/strong><br \/>\nAnalyzing Semantic Change in Japanese Loanwords<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p><!-- Member photos --><\/p>\n<center>\n<table>\n<tbody>\n<tr>\n<td><a href=\"https:\/\/web.archive.org\/web\/20170503224245\/http:\/\/lr-www.pi.titech.ac.jp\/wp\/wp-content\/uploads\/2018\/lab_trip2018.jpg\"><img loading=\"lazy\" class=\"alignnone size-full wp-image-1362\" src=\"http:\/\/lr-www.pi.titech.ac.jp\/wp\/wp-content\/uploads\/2018\/lab_trip2018.jpg\" alt=\"Lab retreat 2018\" width=\"459\" height=\"305\"><br \/>\n<\/a><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/center>\n<p><!-- Links to affiliated organizations --><\/p>\n<h2>Links<\/h2>\n<ul>\n<li><a href=\"https:\/\/www.isct.ac.jp\/ja\">Institute of Science Tokyo<\/a><\/li>\n<li><a href=\"http:\/\/educ.titech.ac.jp\/ict\/\">Department of Information and Communications Engineering, School of Engineering<\/a><\/li>\n<li><a href=\"http:\/\/www.first.iir.titech.ac.jp\/index.html\">Laboratory for Future Interdisciplinary Research of Science and Technology<\/a><\/li>\n<\/ul>\n<ul>\n<li style=\"list-style-type: none;\">\n<ul>Former affiliations<\/ul>\n<\/li>\n<\/ul>\n<ul>\n<li><a href=\"http:\/\/www.igs.titech.ac.jp\/\">Interdisciplinary Graduate School of Science and Engineering<\/a><\/li>\n<li><a href=\"http:\/\/www.dis.titech.ac.jp\/\">Department of Computational Intelligence and Systems Science<\/a><\/li>\n<li><a href=\"http:\/\/www.ip.titech.ac.jp\/\">Department of Information Processing<\/a><\/li>\n<\/ul>
src=\"https:\/\/web.archive.org\/web\/20170503224245im_\/http:\/\/lr-www.pi.titech.ac.jp\/wp\/wp-content\/plugins\/php-execution-plugin\/assets\/trans.gif\" alt=\"PD9waHAgLyogZWMzX2dldF9ldmVudHMoNSk7ICovID8+\"><\/p>\n<p><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/st
rong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><strong><\/strong><\/p>\n<p><\/strong><\/strong><\/p>\n","protected":false},"excerpt":{"rendered":"<p>\u65e5\u672c\u8a9e\/English \u5965\u6751\u30fb\u8239\u8d8a\u7814\u7a76\u5ba4\u3067\u306f\uff0c\u3053\u3068\u3070\u3092\u8a08\u7b97\u6a5f\u3067\u51e6\u7406\u3059\u308b\u6280\u8853(\u81ea\u7136\u8a00\u8a9e\u51e6\u7406)\u306b\u95a2\u3059\u308b\u7814\u7a76\u3068\uff0c\u305d\u306e\u6280\u8853\u3092\u7528\u3044\u305f\u5fdc\u7528\u30b7\u30b9\u30c6\u30e0\u306e\u958b\u767a\u3092\u884c\u306a\u3063\u3066\u3044\u307e\u3059\uff0e \u3053\u3068\u3070\u306e\u7406\u89e3\u3068\u3044\u3046\u30c6\u30fc\u30de\u3067\u306f\uff0c\u96e3\u3057\u3044\u3068\u3055\u308c\u308b\uff0c\u610f\u5473\uff0c\u6587\u8108\u7406\u89e3 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":[],"_links":{"self":[{"href":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/index.php?rest_route=\/wp\/v2\/pages\/17"}],"collection":[{"href":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=17"}],"version-history":[{"count":222,"href":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/index.php?rest_route=\/wp\/v2\/pages\/17\/revisions"}],"predecessor-version":[{"id":2093,"href":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/index.php?rest_route=\/wp\/v2\/pages\/17\/revisions\/2093"}],"wp:attachment":[{"href":"https:\/\/www.lr.first.iir.isct.ac.jp\/wp\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=17"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}