
Relation Extraction using Deep Learning Methods - A Survey


Affiliations
1 Department of Computer Science and Engineering, Government Engineering College Sreekrishnapuram, India
2 Department of Computer Science and Engineering, College of Engineering Trivandrum, India
     



Relation extraction plays an important role in deriving structured information from unstructured raw text, and it is a crucial ingredient in many information extraction systems that mine structured facts from text. Neural networks now play a central role in this task. Traditional, non-deep-learning models require extensive hand-crafted feature engineering, whereas deep learning models such as Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks require far less. With the creation of large datasets through distant supervision, relation extraction can fully exploit deep learning models. This paper surveys current trends in relation extraction using deep learning models.
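The distant supervision heuristic mentioned above (Mintz et al., 2009, listed in the references) can be sketched in a few lines: every sentence that mentions both entities of a known knowledge-base fact is heuristically labelled with that fact's relation. The knowledge base and sentences below are invented purely for illustration:

```python
# Distant supervision sketch: any sentence containing both entities of a
# known knowledge-base fact is (noisily) labelled with that fact's relation.
# The KB facts and sentences are made-up illustrative data.

KB = {
    ("Barack Obama", "Honolulu"): "born_in",
    ("Paris", "France"): "capital_of",
}

def label_sentences(sentences, kb):
    """Return (sentence, head, tail, relation) tuples for every KB pair
    whose two entity strings both occur in the sentence."""
    labelled = []
    for sent in sentences:
        for (head, tail), relation in kb.items():
            if head in sent and tail in sent:
                labelled.append((sent, head, tail, relation))
    return labelled

sentences = [
    "Barack Obama was born in Honolulu in 1961.",
    "Barack Obama visited Honolulu last week.",  # noisy match: no birth stated
    "Paris is the capital of France.",
]

for sent, head, tail, rel in label_sentences(sentences, KB):
    print(f"{rel}({head}, {tail}) <- {sent}")
```

The second sentence shows why such labels are noisy: it mentions both entities but does not express the birth relation, which is what motivates the multi-instance learning approaches (Riedel et al., 2010; Surdeanu et al., 2012) also listed in the references.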

Keywords

Relation Extraction, Deep Learning, LSTM, CNN, Word Embeddings.
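As an illustration of how little feature engineering these models need, the following sketch builds the kind of input representation used by CNN-based relation classifiers in the style of Zeng et al. (2014, listed in the references): each token is encoded as a word embedding concatenated with two relative-position embeddings, one per target entity. The vocabulary, sentence, and dimensions are invented, and the randomly initialized tables stand in for learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = {"the": 0, "acquisition": 1, "of": 2, "widgetco": 3,
         "by": 4, "megacorp": 5, "closed": 6}
WORD_DIM, POS_DIM, MAX_DIST = 8, 4, 10

# Randomly initialized stand-ins for learned embedding tables.
word_emb = rng.normal(size=(len(VOCAB), WORD_DIM))
# One position table per entity; relative distances are clipped to
# [-MAX_DIST, MAX_DIST] and shifted to non-negative indices.
pos_emb1 = rng.normal(size=(2 * MAX_DIST + 1, POS_DIM))
pos_emb2 = rng.normal(size=(2 * MAX_DIST + 1, POS_DIM))

def encode(tokens, e1_idx, e2_idx):
    """Concatenate word and relative-position embeddings for each token."""
    rows = []
    for i, tok in enumerate(tokens):
        d1 = np.clip(i - e1_idx, -MAX_DIST, MAX_DIST) + MAX_DIST
        d2 = np.clip(i - e2_idx, -MAX_DIST, MAX_DIST) + MAX_DIST
        rows.append(np.concatenate([word_emb[VOCAB[tok]],
                                    pos_emb1[d1], pos_emb2[d2]]))
    return np.stack(rows)  # shape: (len(tokens), WORD_DIM + 2 * POS_DIM)

tokens = ["the", "acquisition", "of", "widgetco", "by", "megacorp"]
X = encode(tokens, e1_idx=3, e2_idx=5)  # entities: widgetco, megacorp
print(X.shape)  # (6, 16)
```

The resulting matrix is what a convolution-plus-max-pooling layer would consume directly, with no dependency parses or lexical feature templates required.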
References

  • Sachin Pawar, Girish K. Palshikar and Pushpak Bhattacharyya, “Relation Extraction: A Survey”, arXiv Preprint, pp. 1-51, 2017.
  • Marc Moreno Lopez and Jugal Kalita, “Deep Learning applied to NLP”, arXiv Preprint, pp. 1-15, 2017.
  • Yann LeCun, Yoshua Bengio and Geoffrey Hinton, “Deep Learning”, Nature, Vol. 521, pp. 436-444, 2015.
  • Yoshua Bengio, Rejean Ducharme, Pascal Vincent and Christian Jauvin, “A Neural Probabilistic Language Model”, Journal of Machine Learning Research, Vol. 3, pp. 1137-1155, 2003.
  • Omer Levy, Yoav Goldberg and Ido Dagan, “Improving Distributional Similarity with Lessons Learned from Word Embeddings”, Transactions of the Association for Computational Linguistics, Vol. 3, pp. 211-225, 2015.
  • Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado and Jeffrey Dean, “Distributed Representations of Words and Phrases and their Compositionality”, Proceedings of 26th International Conference on Neural Information Processing Systems, Vol. 2, pp. 3111-3119, 2013.
  • Jeffrey Pennington, Richard Socher and Christopher D. Manning, “GloVe: Global Vectors for Word Representation”, Proceedings of International Conference on Empirical Methods in Natural Language Processing, pp. 1532-1543, 2014.
  • Cicero Dos Santos, Bing Xiang and Bowen Zhou, “Classifying Relations by Ranking with Convolutional Neural Networks”, Proceedings of 53rd Annual Meeting of the Association for Computational Linguistics and 7th International Joint Conference on Natural Language Processing, pp. 626-634, 2015.
  • Yan Xu, Lili Mou, Ge Li, Yunchuan Chen, Hao Peng and Zhi Jin, “Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths”, Proceedings of International Conference on Empirical Methods in Natural Language Processing, pp. 1785-1794, 2015.
  • Katrin Fundel, Robert Kuffner and Ralf Zimmer, “RelEx: Relation Extraction using Dependency Parse Trees”, Bioinformatics, Vol. 23, No. 3, pp. 365-371, 2007.
  • David H. Hubel and Torsten N. Wiesel, “Receptive Fields and Functional Architecture of Monkey Striate Cortex”, Journal of Physiology, Vol. 195, No. 1, pp. 215-243, 1968.
  • CS231n: Convolutional Neural Networks for Visual Recognition, Available at: http://cs231n.github.io/convolutional-networks/
  • Daojian Zeng, Kang Liu, Siwei Lai, Guangyou Zhou and Jun Zhao, “Relation Classification via Convolutional Deep Neural Network”, Proceedings of 25th International Conference on Computational Linguistics: Technical Papers, pp. 2335-2344, 2014.
  • Yatian Shen and Xuanjing Huang, “Attention-Based Convolutional Neural Network for Semantic Relation Extraction”, Proceedings of 26th International Conference on Computational Linguistics: Technical Papers, pp. 2526-2536, 2016.
  • Linlin Wang, Zhu Cao, Gerard de Melo and Zhiyuan Liu, “Relation Classification via Multi-Level Attention CNNs”, Proceedings of 54th Annual Meeting of the Association for Computational Linguistics, pp. 1298-1307, 2016.
  • Kun Xu, Yansong Feng, Songfang Huang and Dongyan Zhao, “Semantic Relation Classification via Convolutional Neural Networks with Simple Negative Sampling”, Proceedings of International Conference on Empirical Methods in Natural Language Processing, pp. 536-540, 2015.
  • Sepp Hochreiter and Jürgen Schmidhuber, “Long Short-Term Memory”, Neural Computation, Vol. 9, No. 8, pp. 1735-1780, 1997.
  • Mihai Surdeanu, Julie Tibshirani, Ramesh Nallapati and Christopher D. Manning, “Multi-Instance Multi-Label Learning for Relation Extraction”, Proceedings of Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, pp. 455-465, 2012.
  • Iris Hendrickx, Su Nam Kim, Zornitsa Kozareva and Preslav Nakov, “SemEval-2010 Task 8: Multi-Way Classification of Semantic Relations Between Pairs of Nominals”, Proceedings of 5th International Workshop on Semantic Evaluation, pp. 333-338, 2010.
  • Yang Liu, Furu Wei, Sujian Li, Heng Ji, Ming Zhou and Houfeng Wang, “A Dependency-Based Neural Network for Relation Classification”, Proceedings of 53rd Annual Meeting of the Association for Computational Linguistics and 7th International Joint Conference on Natural Language Processing, pp. 285-290, 2015.
  • Shu Zhang, Dequan Zheng, Xinchen Hu and Ming Yang, “Bi-Directional Long Short-Term Memory Networks for Relation Classification”, Proceedings of 29th Pacific Asia Conference on Language, Information and Computation, pp. 73-78, 2015.
  • Menglong Wu, Lin Liu, Wenxi Yao, Chunyong Yin and Jin Wang, “Semantic Relation Classification by Bi-Directional LSTM Architecture”, Advanced Science and Technology Letters, Vol. 143, pp. 205-210, 2017.
  • Makoto Miwa and Mohit Bansal, “End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures”, Proceedings of 54th Annual Meeting of the Association for Computational Linguistics, Vol. 1, pp. 1105-1118, 2016.
  • Peng Zhou, Wei Shi, Jun Tian, Zhenyu Qi, Bingchen Li, Hongwei Hao and Bo Xu, “Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification”, Proceedings of 54th Annual Meeting of the Association for Computational Linguistics, pp. 207-212, 2016.
  • Yan Xu, Ran Jia, Lili Mou, Ge Li, Yunchuan Chen, Yangyang Lu and Zhi Jin, “Improved Relation Classification by Deep Recurrent Neural Networks with Data Augmentation”, Proceedings of 26th International Conference on Computational Linguistics, pp. 1-10, 2016.
  • Rui Cai, Xiao dong Zhang and Houfeng Wang, “Bidirectional Recurrent Convolutional Neural Network for Relation Classification”, Proceedings of 54th Annual Meeting of the Association for Computational Linguistics, pp. 756-765, 2016.
  • Mike Mintz, Steven Bills, Rion Snow and Dan Jurafsky, “Distant Supervision for Relation Extraction without Labelled Data”, Proceedings of Joint Conference of the 47th Annual Meeting of the Association for Computational Linguistics and 4th International Joint Conference on Natural Language Processing, pp. 1003-1011, 2009.
  • Kurt Bollacker, Colin Evans, Praveen Paritosh, Tim Sturge, and Jamie Taylor, “Freebase: A Collaboratively Created Graph Database for Structuring Human Knowledge”, Proceedings of ACM International Conference on Management of Data, pp. 1247-1250, 2008.
  • Raphael Hoffmann, Congle Zhang, Xiao Ling, Luke Zettlemoyer and Daniel S. Weld, “Knowledge-based Weak Supervision for Information Extraction of Overlapping Relations”, Proceedings of 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, pp. 541-550, 2011.
  • Daojian Zeng, Kang Liu, Yubo Chen and Jun Zhao, “Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks”, Proceedings of International Conference on Empirical Methods in Natural Language Processing, pp. 1753-1762, 2015.
  • Guoliang Ji, Kang Liu, Shizhu He and Jun Zhao, “Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions”, Proceedings of 31st AAAI Conference on Artificial Intelligence, pp. 1132-1139, 2017.
  • Sebastian Riedel, Limin Yao and Andrew McCallum, “Modeling Relations and their Mentions without Labeled Text”, Proceedings of European Conference on Machine Learning and Knowledge Discovery in Databases, pp. 148-163, 2010.
  • Jenny Rose Finkel, Trond Grenager and Christopher Manning, “Incorporating Non-local Information into Information Extraction Systems by Gibbs Sampling”, Proceedings of 43rd Annual Meeting of the Association for Computational Linguistics, pp. 363-370, 2005.


Authors

C. A. Deepa
Department of Computer Science and Engineering, Government Engineering College Sreekrishnapuram, India
P. C. ReghuRaj
Department of Computer Science and Engineering, Government Engineering College Sreekrishnapuram, India
Ajeesh Ramanujan
Department of Computer Science and Engineering, College of Engineering Trivandrum, India
