2015
The paper presents a novel approach to generating explanations for solving Math Word Problems (MWP) using a dedicated Explanation Generation (EG) module integrated within a Machine Reading framework. The system first analyzes the MWP, constructs logical representations, and then employs natural language generation techniques to articulate the reasoning process behind the answers. The proposed method addresses the unique challenges posed by MWPs, providing a structured and comprehensible explanation of mathematical operations involved.
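The pipeline described above, logic-form construction followed by natural language generation, can be pictured with a toy sketch; the record format, wording templates, and example values below are illustrative assumptions, not the paper's EG module.

```python
# A toy sketch of turning a solved operation record into a natural-language
# explanation, in the spirit of the EG module described above (the record
# format and wording templates are invented for illustration).
record = {"operator": "+",
          "operands": [(3, "apples Tom had"), (4, "apples Tom bought")],
          "result": 7, "unit": "apples"}

OP_PHRASE = {"+": "adding", "-": "subtracting", "*": "multiplying", "/": "dividing"}

operand_text = " and ".join(f"{v} ({desc})" for v, desc in record["operands"])
print(f"The answer is obtained by {OP_PHRASE[record['operator']]} "
      f"{operand_text}, giving {record['result']} {record['unit']}.")
# -> The answer is obtained by adding 3 (apples Tom had) and 4 (apples Tom bought),
#    giving 7 apples.
```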
Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Demonstrations
This paper presents a meaning-based statistical math word problem (MWP) solver with understanding, reasoning and explanation. It comprises a web user interface and pipelined modules for analysing the text, transforming both body and question parts into their logic forms, and then performing inference on them. The associated context of each quantity is represented with proposed role-tags (e.g., nsubj, verb, etc.), which provides the flexibility for annotating the extracted math quantity with its associated syntactic and semantic information (which specifies the physical meaning of that quantity). Those role-tags are then used to identify the desired operands and filter out irrelevant quantities (so that the answer can be obtained precisely). Since the physical meaning of each quantity is explicitly represented with those role-tags and used in the inference process, the proposed approach could explain how the answer is obtained in a human comprehensible way.
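As a rough illustration of the role-tag idea, the sketch below annotates each extracted quantity with its syntactic/semantic context and filters operands by those tags; the Quantity fields, example problem, and hard-coded summation are assumptions of this sketch, not the authors' implementation.

```python
# A minimal sketch of the role-tag idea described above (names and the
# example problem are illustrative assumptions, not the authors' code).
from dataclasses import dataclass

@dataclass
class Quantity:
    value: float
    nsubj: str   # grammatical subject associated with the quantity
    verb: str    # governing verb, approximating its physical meaning
    obj: str     # object noun the quantity counts

# Body: "Tom has 3 apples and 2 books. He buys 4 apples."
body = [
    Quantity(3, nsubj="Tom", verb="has",  obj="apple"),
    Quantity(2, nsubj="Tom", verb="has",  obj="book"),
    Quantity(4, nsubj="Tom", verb="buys", obj="apple"),
]

# Question: "How many apples does Tom have now?" -> filter by role-tags,
# then apply the operator (hard-coded here; the paper selects it statistically).
relevant = [q for q in body if q.obj == "apple"]
answer = sum(q.value for q in relevant)
print(answer)  # 7 -- the kept role-tags also let the system explain each operand
```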
2015
This paper proposes a tag-based statistical framework to solve math word problems with understanding and reasoning. It analyzes the body and question texts into their associated tag-based logic forms, and then performs inference on them. Compared to rule-based approaches, the proposed statistical approach alleviates the rule-coverage and ambiguity-resolution problems, and the tag-based design also provides the flexibility of handling various kinds of related questions with the same body logic form. On the other hand, compared to purely statistical approaches, the proposed approach is more robust to irrelevant information and can provide the answer more accurately. The major contributions of this work are: (1) a tag-based logic representation that makes the system less sensitive to irrelevant information and lets it provide the answer more precisely; (2) a unified statistical framework for performing reasoning from the given text.
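A minimal sketch of how one body logic form can serve several question logic forms under tag-based selection; the facts, tag names, and the `ask` helper below are invented for illustration and are not the paper's formalism.

```python
# Illustrative sketch only: one body logic form reused for several question
# logic forms, as the tag-based design above allows (example facts invented).
facts = [
    {"quan": 5, "agent": "Mary", "obj": "pencil"},
    {"quan": 3, "agent": "John", "obj": "pencil"},
    {"quan": 7, "agent": "Mary", "obj": "eraser"},
]

def ask(facts, obj=None, agent=None, op=sum):
    """Select operands by tag constraints, then apply the inferred operator."""
    operands = [f["quan"] for f in facts
                if (obj is None or f["obj"] == obj)
                and (agent is None or f["agent"] == agent)]
    return op(operands)

print(ask(facts, obj="pencil"))               # "How many pencils in all?" -> 8
print(ask(facts, obj="pencil", agent="Mary")) # "How many pencils does Mary have?" -> 5
```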
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), 2018
We introduce MeSys, a meaning-based approach, for solving English math word problems (MWPs) via understanding and reasoning in this paper. It first analyzes the text, transforms both body and question parts into their corresponding logic forms, and then performs inference on them. The associated context of each quantity is represented with proposed role-tags (e.g., nsubj, verb, etc.), which provides the flexibility for annotating an extracted math quantity with its associated context information (i.e., the physical meaning of this quantity). Statistical models are proposed to select the operator and operands. A noisy dataset is designed to assess whether a solver solves MWPs mainly via understanding or via mechanical pattern matching. Experimental results show that our approach outperforms existing systems on both benchmark datasets and the noisy dataset, which demonstrates that the proposed approach relies more on understanding the meaning of each quantity in the text.
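The statistical operator selection could be pictured, very loosely, as a text classifier over body-question pairs; the toy features, training pairs, and scikit-learn pipeline below are assumptions of this sketch, not the MeSys models.

```python
# A hedged sketch of "statistical operator selection": a toy classifier that
# maps simple features of a body-question pair to an operator. The training
# pairs here are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_text = [
    "has 3 apples buys 2 apples how many in all",
    "has 5 pens gives away 2 pens how many left",
    "each box holds 4 balls there are 3 boxes how many balls",
]
train_op = ["+", "-", "*"]

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(train_text, train_op)
print(model.predict(["has 6 books buys 3 books how many in all"]))  # likely '+'
```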
International Journal of Information and Education Technology, 2013
Mathematical word problems represent many real-world scenarios, yet they are often found difficult to solve. The difficulty stems from miscomprehension and the inability to formulate a mathematical representation of the text. In this paper we present a framework based on a fuzzy logic ontology model that interprets a mathematical word problem in natural language and computes a solution. This framework uses an ontology as a working memory and a search engine to compute the solution. Fuzzy logic is used to determine the confidence of the returned result, which can eliminate the confusion of a user trying to determine whether the solution provided is correct. The ability to interpret a mathematical word problem and return a detailed solution will help educate users by providing them with the detailed steps of a solution.
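One way to picture the fuzzy confidence computation is as a t-norm over per-step membership degrees; the step names and values below are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of the fuzzy-confidence idea: each matching step between the
# word problem and the ontology returns a membership degree in [0, 1], and the
# overall confidence is their t-norm (here, the product). Values are invented.
step_memberships = {
    "entity match (apples -> countable object)": 0.95,
    "relation match ('gives' -> subtraction)":   0.80,
    "unit consistency":                          1.00,
}

confidence = 1.0
for step, mu in step_memberships.items():
    confidence *= mu          # product t-norm; min() is another common choice
    print(f"{step}: {mu:.2f}")

print(f"overall confidence in the returned solution: {confidence:.2f}")
```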
2018
Word problem solving has always been a challenging task as it involves reasoning across sentences, identification of operations and their order of application on relevant operands. Most of the earlier systems attempted to solve word problems with tailored features for handling each category of problems. In this paper, we present a new approach to solve simple arithmetic problems. Through this work we introduce a novel method where we first learn a dense representation of the problem description conditioned on the question in hand. We leverage this representation to generate the operands and operators in the appropriate order. Our approach improves upon the state-of-the-art system by 3% in one benchmark dataset while ensuring comparable accuracies in other datasets.
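A very rough sketch of a question-conditioned encoder feeding a decoder that emits one operator/operand prediction per step; the `ToySolver` architecture, sizes, and conditioning-by-addition are assumptions of this sketch, not the authors' model.

```python
# A sketch (not the authors' architecture) of learning a question-conditioned
# representation of the problem and decoding operators/operands from it.
import torch
import torch.nn as nn

class ToySolver(nn.Module):
    def __init__(self, vocab_size, hidden=64, out_vocab=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.body_enc = nn.GRU(hidden, hidden, batch_first=True)
        self.ques_enc = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, out_vocab)   # operators + operand slots

    def forward(self, body_ids, ques_ids, max_steps=3):
        _, h_body = self.body_enc(self.embed(body_ids))
        _, h_ques = self.ques_enc(self.embed(ques_ids))
        h = h_body + h_ques                       # condition the body on the question
        inp = torch.zeros(body_ids.size(0), max_steps, h.size(-1))
        dec_out, _ = self.decoder(inp, h)
        return self.out(dec_out)                  # one prediction per output step

logits = ToySolver(vocab_size=100)(torch.randint(0, 100, (1, 12)),
                                   torch.randint(0, 100, (1, 6)))
print(logits.shape)  # torch.Size([1, 3, 8])
```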
2019 14th Conference on Industrial and Information Systems (ICIIS)
Existing approaches for automatically generating mathematical word problems lack customizability and creativity due to the inherent nature of the template-based mechanisms they employ. We present a solution to this problem with the use of deep neural language generation mechanisms. Our approach uses a character-level Long Short-Term Memory (LSTM) network to generate word problems, and uses POS (part-of-speech) tags to resolve the constraints found in the generated problems. Our approach is capable of generating mathematics word problems in both English and Sinhala with an accuracy of over 90%.
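A minimal character-level language-model sketch in the spirit of the generator described above; the toy corpus, model sizes, and training-loss fragment are assumptions, not the authors' network.

```python
# A minimal character-level language-model sketch for word-problem generation
# (toy corpus and sizes; not the authors' model).
import torch
import torch.nn as nn

corpus = "John has 4 mangoes. He buys 3 more. How many mangoes does John have? "
chars = sorted(set(corpus))
stoi = {c: i for i, c in enumerate(chars)}

class CharLSTM(nn.Module):
    def __init__(self, n_chars, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(n_chars, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_chars)

    def forward(self, x, state=None):
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state

model = CharLSTM(len(chars))
ids = torch.tensor([[stoi[c] for c in corpus[:-1]]])
targets = torch.tensor([[stoi[c] for c in corpus[1:]]])
logits, _ = model(ids)
loss = nn.functional.cross_entropy(logits.reshape(-1, len(chars)), targets.reshape(-1))
print(loss.item())  # train by minimising this next-character loss, then sample
```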
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
Solving algebraic word problems has recently emerged as an important natural language processing task. To solve algebraic word problems, recent studies suggested neural models that generate solution equations by using 'Op (operator/operand)' tokens as the unit of input/output. However, such neural models suffer from two issues: expression fragmentation and operand-context separation. To address these two issues, we propose a pure neural model, the Expression-Pointer Transformer (EPT), which uses (1) 'Expression' tokens and (2) operand-context pointers when generating solution equations. The performance of the EPT model is tested on three datasets: ALG514, DRAW-1K, and MAWPS. Compared to the state-of-the-art (SoTA) models, the EPT model achieves comparable accuracy on each of the three datasets: 81.3% on ALG514, 59.5% on DRAW-1K, and 84.5% on MAWPS. The contribution of this paper is two-fold: (1) we propose a pure neural model, EPT, which addresses expression fragmentation and operand-context separation; (2) the fully automatic EPT model, which does not use hand-crafted features, yields performance comparable to existing models that use hand-crafted features, and outperforms existing pure neural models by up to 40%.
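The 'Expression' token plus operand-context pointer idea can be illustrated with a toy data structure in which each operand points either at a number's position in the text or at an earlier expression; the representation and evaluator below are a sketch, not the released EPT code.

```python
# Illustration only: how "Expression" tokens with operand-context pointers can
# represent a solution equation without fragmenting it (toy example).
problem = "A farmer has 15 cows and sells 6 of them . How many cows remain ?"
tokens = problem.split()

# Each expression token names an operator and points each operand either at a
# text position (number in context) or at a previously built expression (R0, R1, ...).
expressions = [
    {"id": "R0", "op": "-", "args": [("text", tokens.index("15")),
                                     ("text", tokens.index("6"))]},
]

def evaluate(expr, tokens, memory):
    vals = [float(tokens[i]) if kind == "text" else memory[i]
            for kind, i in expr["args"]]
    return vals[0] - vals[1] if expr["op"] == "-" else sum(vals)

memory = {}
for e in expressions:
    memory[e["id"]] = evaluate(e, tokens, memory)
print(memory["R0"])  # 9.0
```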
IRJET, 2020
A computer program can easily compute solutions to an equation, but it cannot understand the same problem given in the form of a word problem, where the computer needs to identify which equation to apply and what information is provided in the problem text. Here, we describe an NLP-based approach with which a computer can be trained to identify the topic and subtopic, and ultimately the equation applicable to the solution of the given problem, as well as the known values in the problem. Only a handful of utility software is available today providing this functionality, and the online tools provide solutions only for template-based problems. This study goes beyond currently used template-based models: it uses parsing techniques to identify the given information in the paragraph and relies on the TF-IDF measure with an SVM classifier for topic classification. The current implementation of the system considers only high-school-level physics problems.
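A hedged sketch of the TF-IDF-with-SVM topic classification step; the physics topics, training sentences, and scikit-learn pipeline are placeholders, not the system's actual data or code.

```python
# Sketch of TF-IDF features feeding an SVM topic classifier (toy data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

problems = [
    "A car accelerates from rest to 20 m/s in 5 seconds",
    "A ball is dropped from a height of 45 m",
    "A 2 ohm resistor carries a current of 3 A",
    "Find the voltage across a 10 ohm resistor drawing 0.5 A",
]
topics = ["kinematics", "kinematics", "electricity", "electricity"]

clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(problems, topics)
print(clf.predict(["A train slows from 30 m/s to rest in 10 seconds"]))
# -> likely ['kinematics']; the chosen topic then narrows the candidate equations
```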
Nowadays, arithmetic questions expressed in natural language, such as English, are attracting great interest from researchers. Although some useful research has been carried out on solving word problems, there are still gaps in implementing a robust arithmetic word problem solver, as the answers to word problems cannot be easily extracted with keyword or pattern matching. Motivated by this, this research focuses on generating the correct equation from the word problem and deriving the solution. The aim of the proposed work is to implement an arithmetic word problem solver that can understand elementary math word problems, derive the symbolic equation, and generate the result from the equation. The system is implemented with a combination of verb semantics and a graph. Elementary students can benefit from the system since it produces the equation along with the answer. Keywords: Mathematical Word Problem (MWP), Natural Language, Arithmetic Word Problem, Word Problem Solver.
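The combination of verb semantics and a graph-like state can be pictured as verb categories driving quantity updates whose trace forms the equation; the verb lexicon, example problem, and update rules below are invented for this sketch and are not the paper's system.

```python
# Illustrative sketch of verb semantics driving a simple state update:
# each verb category changes an entity's quantity, and the recorded updates
# yield the symbolic equation (verb lexicon and problem invented).
VERB_CATEGORY = {"has": "observation", "buys": "gain", "eats": "loss"}

problem = [("Lily", "has", 8, "cookies"),
           ("Lily", "buys", 5, "cookies"),
           ("Lily", "eats", 3, "cookies")]

state, equation = {}, []
for agent, verb, qty, obj in problem:
    cat = VERB_CATEGORY[verb]
    key = (agent, obj)
    if cat == "observation":
        state[key] = qty; equation.append(str(qty))
    elif cat == "gain":
        state[key] += qty; equation.append(f"+ {qty}")
    elif cat == "loss":
        state[key] -= qty; equation.append(f"- {qty}")

print(" ".join(equation), "=", state[("Lily", "cookies")])  # 8 + 5 - 3 = 10
```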
2021
Current neural math solvers learn to incorporate commonsense or domain knowledge by utilizing pre-specified constants or formulas. However, as these constants and formulas are mainly human-specified, the generalizability of the solvers is limited. In this paper, we propose to explicitly retrieve the required knowledge from math problem datasets. In this way, we can precisely characterize the required knowledge and improve the explainability of solvers. Our two algorithms take the problem text and the solution equations as input, and then try to deduce the required commonsense and domain knowledge by integrating information from both parts. We construct two math datasets and show the effectiveness of our algorithms: they can retrieve the knowledge required for problem solving.
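One simple way to picture the retrieval idea is to flag any constant that appears in the solution equation but not in the problem text as required external knowledge; the example problem-equation pair below is invented for illustration.

```python
# A minimal sketch of the retrieval idea: a constant present in the solution
# equation but absent from the problem text is a candidate piece of required
# external knowledge (example pair invented).
import re

problem = "Tom reads 12 pages every day. How many pages does he read in 2 weeks?"
equation = "12 * 7 * 2"

text_numbers = set(re.findall(r"\d+", problem))
eq_numbers = set(re.findall(r"\d+", equation))

implicit = eq_numbers - text_numbers
print(implicit)  # {'7'} -> candidate commonsense fact: "1 week = 7 days"
```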