Papers by Ho-jin Choi
ACIS International Conference on Computer and Information Science, 2005
Rational Unified Process (RUP) provides a component-based development process which is use-case driven, architecture-centric, and iterative and incremental. This paper describes our experience of applying RUP to the development of a web-based project management system, which was performed by a five-member team over one year. The paper introduces how we applied RUP in the development of the system and ...
Pacific Rim International Conference on Artificial Intelligence, 2002
An MLP-based speaker verification system needs to provide a fast speaker enrollment process as well as a high speaker recognition rate and a quick speaker verification process. The multilayer perceptron (MLP) achieves a higher pattern recognition rate than existing parametric pattern recognition methods without assuming an underlying density distribution, and enables rapid operation by sharing internal parameters between various models. Among a variety ...

2008 8th IEEE International Conference on Computer and Information Technology, 2008
Use case scenarios have been commonly used for single products. However, when used for software product lines, they raise new issues to consider. In software product lines, products share common features and additionally have their own unique sets of features, where the latter can be represented by a so-called variability model. When various combinations of variants are selected, they should be selected such that they obey the constraints imposed by the variability model. Therefore, the use cases developed for a product line cannot be used straightforwardly for products. In this paper, we provide a systematic way to map the constraints in a variability model called OVM to use case scenarios using the notion of tagged use case scenarios. We also present an algorithm for automatically generating product use case scenarios based on the OVM model and tagged use case scenarios.
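A minimal sketch of the tagging idea described above, under simplifying assumptions: each scenario step carries the set of variants it belongs to, a variant selection is checked against simple requires/excludes constraints standing in for the OVM model, and a product scenario keeps only the steps whose tags are satisfied. The names (Step, generate_product_scenario) and the constraint encoding are illustrative, not the paper's notation or algorithm.

from dataclasses import dataclass

@dataclass
class Step:
    text: str
    tags: frozenset = frozenset()   # variants this step belongs to; empty = common step

def selection_is_valid(selected, requires, excludes):
    """Check a variant selection against simple OVM-style constraints."""
    for variant, needed in requires.items():
        if variant in selected and not needed <= selected:
            return False
    for a, b in excludes:
        if a in selected and b in selected:
            return False
    return True

def generate_product_scenario(scenario, selected):
    """Keep common steps and steps whose tags are all in the selection."""
    return [s for s in scenario if not s.tags or s.tags <= selected]

# usage with a toy payment scenario
scenario = [
    Step("User logs in"),
    Step("User pays by credit card", frozenset({"credit_card"})),
    Step("User pays by bank transfer", frozenset({"bank_transfer"})),
    Step("System sends receipt"),
]
requires = {"credit_card": {"online_payment"}}
excludes = [("credit_card", "bank_transfer")]
selected = {"online_payment", "credit_card"}
if selection_is_valid(selected, requires, excludes):
    for step in generate_product_scenario(scenario, selected):
        print(step.text)
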
2015 International Conference on Big Data and Smart Computing (BIGCOMP), 2015
Lecture Notes in Computer Science, 2007
In broadband networks, dynamic service change affects the management system because of the variability in time to market, transmission quality and transport system. Therefore, reusability and maintainability become important quality factors in a Network Management System (NMS). An aspect-oriented software development method is a feasible solution for the evolvability of NMS. In this paper, we propose a method to generate aspects in a standard management information model using the Aspect Conversion and Metric Calculation (ACMC) algorithm. We applied it to ITU-T M.3100 and evaluated it via the ratio of reduced redundancy in terms of crosscutting concerns.

Lecture Notes in Computer Science, 2007
The PSP (Personal Software Process) was developed to help developers make high-quality products through improving their personal software development processes. With the consistent measurement and analysis activities that the PSP suggests, developers can identify process deficiencies and make reliable estimates of effort and quality. However, due to the high overhead and context-switching problem of manual data recording, developers have difficulty collecting reliable data, which can lead to wrong analysis results. Also, it is very inconvenient to use the paper-based process guide of the PSP for navigating its process information, and difficult to attach additional process-related information to the process guide. In this paper, we describe a PSP supporting tool that we have developed to deal with these problems. The tool provides automated data collection and analysis to help acquire reliable data and identify process deficiencies. It also provides an EPG (Electronic Process Guide) in order to provide easy access and navigation of the PSP process information, which is integrated with an ER (Experience Repository) to allow developers to store development experiences.
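A minimal sketch of the automated time-recording idea behind such a tool: effort is logged per PSP phase without manual stopwatch bookkeeping, so derived measures such as time in phase stay consistent. The class and method names below are illustrative, not those of the tool described in the paper.

import time
from collections import defaultdict
from contextlib import contextmanager

class TimeLog:
    def __init__(self):
        self.minutes = defaultdict(float)

    @contextmanager
    def phase(self, name):
        # measure wall-clock time spent inside the with-block for this phase
        start = time.monotonic()
        try:
            yield
        finally:
            self.minutes[name] += (time.monotonic() - start) / 60.0

    def summary(self):
        # minutes per phase plus percentage of total effort
        total = sum(self.minutes.values()) or 1.0
        return {p: (m, 100.0 * m / total) for p, m in self.minutes.items()}

# usage
log = TimeLog()
with log.phase("design"):
    pass  # design work happens here
with log.phase("code"):
    pass  # coding work happens here
print(log.summary())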

The Scientific World Journal, 2014
We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link-based similarities that does not suffer from these problems. In the proposed approach, each target object is represented by a vector. Each element of the vector corresponds to one of the objects in the given data, and the value of each element denotes the weight for the corresponding object. As this weight value, we propose to use the probability of reaching the specific object from the target object, computed using the "Random Walk with Restart" strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance ...
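A minimal sketch of the measure described above, assuming a small adjacency matrix: each object's vector holds Random-Walk-with-Restart probabilities of reaching every other object, and similarity is the cosine of two such vectors. The restart probability and iteration count are arbitrary choices here, not values from the paper.

import numpy as np

def rwr_vector(adj, start, restart=0.15, iters=100):
    """Random Walk with Restart probabilities starting from node `start`."""
    n = adj.shape[0]
    col_sums = adj.sum(axis=0)
    col_sums[col_sums == 0] = 1.0
    P = adj / col_sums               # column-stochastic transition matrix
    e = np.zeros(n); e[start] = 1.0
    r = e.copy()
    for _ in range(iters):
        r = (1 - restart) * P @ r + restart * e
    return r

def link_similarity(adj, i, j):
    """Cosine similarity of the two objects' RWR probability vectors."""
    vi, vj = rwr_vector(adj, i), rwr_vector(adj, j)
    return float(vi @ vj / (np.linalg.norm(vi) * np.linalg.norm(vj)))

# usage: a toy 4-node undirected graph
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(link_similarity(adj, 0, 1))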

5th ACIS International Conference on Software Engineering Research, Management & Applications (SERA 2007), 2007
The advent of software process models such as CMM/CMMI (Capability Maturity Model/Capability Maturity Model Integration) has helped software engineers understand principles and approaches of software process improvement. However, it has been difficult to increase productivity by applying those models, since the "how" is not within the scope of the CMM/CMMI. For this reason, the SEI (Software Engineering Institute) introduced the PSP/TSP (Personal Software Process/Team Software Process); however, these still lack statistical analysis tools and systematic process control techniques for analyzing measures collected in PSP/TSP. Six Sigma, on the other hand, provides the quantitative analysis tools necessary to identify high-leverage activities, control process performance and evaluate the effectiveness of process changes. Deploying PSP/TSP in conjunction with Six Sigma, therefore, can directly lead to improved project performance and continuous process improvement by analyzing data, assessing process stability, and prioritizing improvements in PSP/TSP. Continuing with this rationale, a framework that guides how and where Six Sigma tools are considered in PSP/TSP is proposed.
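One concrete example of the kind of statistical process control tool referred to above: an individuals (XmR) control chart applied to per-program defect densities collected in PSP. The chart, the 2.66 constant, and the illustrative data are standard Six Sigma material, not reproduced from the paper's framework.

def xmr_limits(values):
    """Natural process limits for an individuals (XmR) control chart."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mean_x = sum(values) / len(values)
    mean_mr = sum(moving_ranges) / len(moving_ranges)
    ucl = mean_x + 2.66 * mean_mr      # 2.66 = 3 / d2 for subgroup size 2
    lcl = max(0.0, mean_x - 2.66 * mean_mr)
    return lcl, mean_x, ucl

# defects per KLOC for a series of PSP programs (illustrative numbers)
defect_densities = [12.0, 9.5, 14.2, 11.0, 30.1, 10.4]
lcl, center, ucl = xmr_limits(defect_densities)
out_of_control = [x for x in defect_densities if x < lcl or x > ucl]
print(f"limits: [{lcl:.1f}, {ucl:.1f}], center: {center:.1f}, signals: {out_of_control}")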

The 9th International Conference on Advanced Communication Technology, 2007
The definition of multicast routing was already described more than 20 years ago, and many routing algorithms, along with software-based evaluations of them, have been proposed since the year 2000. Multicast is required in various environments, with consideration of transport quality and differing requirements. This means the requirements for multicast services increase steadily, and it is time to meet economic and time-to-market demands together with quality of service. In this paper, we describe the process of analysis, design and implementation of applying the Overlay Multicast Protocol (OMP) to application software, according to an arrangement with a client who wanted specific requirements applied. From a software engineering perspective, we share our experiences and development process for applying OMP.

Data broadcasting is an efficient method for disseminating data, and is widely accepted in the database applications of mobile computing environments because of the asymmetric communication bandwidth between a server and mobile clients. This requires new types of concurrency control mechanisms to support mobile transactions executed on mobile clients, which have low-bandwidth links toward the server. In this paper, we propose an OCC/DTA (Optimistic Concurrency Control with Dynamic Timestamp Adjustment) protocol that can be efficiently adapted to mobile computing environments. The protocol reduces communication overhead by using a client-side validation procedure and enhances transaction throughput by adjusting the serialization order without violating transaction semantics. We show that the proposed protocol satisfies data consistency requirements, and demonstrate through simulation that it can improve the performance of mobile transactions in data broadcasting environments.
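A minimal sketch of the dynamic timestamp adjustment idea, not the full OCC/DTA protocol from the paper: each transaction keeps a serialization timestamp interval, reads and validated conflicts shrink the interval, and the transaction aborts only if the interval becomes empty. Item timestamps and the validation trigger are heavily simplified for illustration.

INF = float("inf")

class Transaction:
    def __init__(self):
        self.lb, self.ub = 0.0, INF    # serialization timestamp interval
        self.reads, self.writes = {}, {}

    def read(self, db, key):
        value, wts = db[key]
        self.lb = max(self.lb, wts)    # must serialize after the item's writer
        self.reads[key] = wts
        return value

    def write(self, key, value):
        self.writes[key] = value       # deferred local write

    def validate_and_commit(self, db, commit_ts):
        # client-side backward validation: re-check the timestamps we read
        for key, wts_seen in self.reads.items():
            current_wts = db[key][1]
            if current_wts != wts_seen:
                # a newer version was committed; try to serialize before it
                self.ub = min(self.ub, current_wts)
        if self.lb > self.ub:
            return False               # interval empty -> abort
        ts = min(max(self.lb, commit_ts), self.ub)
        for key, value in self.writes.items():
            db[key] = (value, ts)
        return True

# usage: database maps key -> (value, write timestamp)
db = {"x": (10, 1.0), "y": (20, 2.0)}
t = Transaction()
t.write("y", t.read(db, "x") + 5)
print(t.validate_and_commit(db, commit_ts=3.0))
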
Journal of Vacuum Science & Technology A: Vacuum, Surfaces, and Films, 1997
Methodologies for Intelligent Systems, 1991
Allen & Koomen' s interval planner and Dean & McDermott's time map manager (TMM) offer ... more Allen & Koomen' s interval planner and Dean & McDermott's time map manager (TMM) offer different approaches to temporal database management in planning. In this paper we present a temporal planning system that integrates ideas from both methods, and at ...
Performing a good component selection plays a critical role in the success of the final system. Although the history of the component selection process in component-based software development is almost a decade old, we have found that no selection process addresses the use of previous decision experience for selecting components for similar requirements. In this research we argue that previous ...
Artificial Intelligence and Pattern Recognition, 2007

5th IEEE/ACIS International Conference on Computer and Information Science and 1st IEEE/ACIS International Workshop on Component-Based Software Engineering,Software Architecture and Reuse (ICIS-COMSAR'06), 2006
The advent of software process models such as CMM/CMMI has helped software engineers understand principles and approaches of software process improvement. However, it has been difficult to increase productivity by applying these models, since the "how" is not within the scope of the CMM/CMMI. For this reason, the SEI introduced PSP/TSP, but these still lack analysis tools and systematic process control techniques for analyzing metrics collected in PSP/TSP. Six Sigma, on the other hand, provides the quantitative analysis tools necessary to control process performance. Deploying PSP/TSP in conjunction with Six Sigma, therefore, can enable software engineers to analyze PSP data and to systematically improve process performance. Continuing with this rationale, we map Six Sigma tools to each PSP process in order to show which Six Sigma techniques can be applied to PSP data, and suggest a practical usage guideline for Six Sigma to support process improvement activity at the individual and team level.

2010 IEEE 34th Annual Computer Software and Applications Conference, 2010
Although synthesis has been considered an important and challenging approach to the construction of a program or a program model in software development, most research on synthesis has been devoted to the construction of state machine models or variations of them. Recently, as process modeling through languages like the UML Activity Diagram and BPMN has appeared as a new paradigm of software development, the ability to synthesize models in such languages from requirements would tremendously increase the scope of automatic software development. This paper presents transformation rules for the synthesis of UML Activity Diagrams from scenario-based specifications modeled as UML Sequence Diagrams. To that end, we first identify various control flow patterns of Sequence Diagrams and define rules for mapping them to corresponding parts of an Activity Diagram. In order to make such mapping precise, labeling rules are introduced for the patterns. We also provide a synthesis algorithm for the construction of a UML Activity Diagram from scenarios.
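A minimal sketch of the kind of pattern-to-node mapping described above, under simplifying assumptions: Sequence Diagram fragments (messages, sequencing, alt, loop) are represented as nested tuples and walked recursively, with messages becoming actions, alt fragments becoming decision/merge nodes, and loop fragments becoming a back edge. The fragment encoding and node naming are made up for illustration; the paper's full rule set and labeling scheme are not shown.

def synthesize(fragment, edges, counter=[0]):
    """Return (entry, exit) activity-node names for a sequence fragment."""
    kind = fragment[0]
    if kind == "msg":                       # a message becomes an action node
        node = f"action:{fragment[1]}"
        return node, node
    if kind == "seq":                       # sequential composition
        entry, exit_ = None, None
        for child in fragment[1:]:
            e_in, e_out = synthesize(child, edges)
            if entry is None:
                entry = e_in
            else:
                edges.append((exit_, e_in))
            exit_ = e_out
        return entry, exit_
    if kind == "alt":                       # alt fragment -> decision/merge pair
        counter[0] += 1
        dec, mrg = f"decision{counter[0]}", f"merge{counter[0]}"
        for operand in fragment[1:]:
            e_in, e_out = synthesize(operand, edges)
            edges.append((dec, e_in))
            edges.append((e_out, mrg))
        return dec, mrg
    if kind == "loop":                      # loop fragment -> decision with back edge
        counter[0] += 1
        dec = f"loop_decision{counter[0]}"
        e_in, e_out = synthesize(("seq",) + tuple(fragment[1:]), edges)
        edges.append((dec, e_in))
        edges.append((e_out, dec))
        return dec, dec
    raise ValueError(kind)

# usage: login followed by an alt over two payment messages
sd = ("seq", ("msg", "login"),
             ("alt", ("msg", "pay_by_card"), ("msg", "pay_by_transfer")))
edges = []
entry, exit_ = synthesize(sd, edges)
print(entry, "->", exit_)
for edge in edges:
    print(edge)
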
Seventh IEEE/ACIS International Conference on Computer and Information Science (icis 2008), 2008
In software project management, there are three major factors to predict and control: size, effort, and quality. Much software engineering work has focused on these. When it comes to software quality, there are various possible quality characteristics of software, but in practice, quality management frequently revolves around defects, and delivered defect density has become the current de facto industry standard. Thus, research related to software quality has focused on modeling residual defects in software in order to estimate software reliability. Currently, the software engineering literature still does not have a complete defect prediction model for a software product, although much work has been performed to predict software quality.
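One simple, well-known instance of the residual-defect estimation this abstract refers to (not a model from the paper): a capture-recapture (Lincoln-Petersen) estimate of the total defect count from two independent inspections, from which the number of still-latent defects follows. The inspection counts below are made up for illustration.

def capture_recapture(found_a, found_b, found_both):
    """Estimate total defects from two independent inspections."""
    if found_both == 0:
        raise ValueError("no overlap between reviewers: estimate is undefined")
    return found_a * found_b / found_both

# defects found by reviewer A, by reviewer B, and by both (illustrative)
found_a, found_b, found_both = 24, 18, 12
total_est = capture_recapture(found_a, found_b, found_both)
distinct_found = found_a + found_b - found_both
remaining = total_est - distinct_found
print(f"estimated total defects: {total_est:.0f}, still latent: {remaining:.0f}")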

Having a variety of good characteristics compared with other pattern recognition techniques, the multilayer perceptron (MLP) has been used in many applications. However, it is known that the error backpropagation (EBP) algorithm that the MLP uses in learning has the drawback of requiring a relatively long learning time. To increase learning speed, it is very effective to use online learning methods, which update the weight vector of the MLP pattern by pattern, because the learning data for pattern recognition contain high redundancy. A typical online EBP algorithm applies a fixed learning rate for each update of the weight vector. Though a large speedup with online EBP can be obtained just by choosing an appropriate fixed rate, fixing the rate is inefficient because it does not fully utilize the instant updates of the online mode. And, although the patterns come to be divided into the learned and the unlearned during the learning process and the learned have no need ...
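A minimal sketch of the online-EBP idea discussed above: weights are updated pattern by pattern, and patterns whose error is already below a threshold are treated as learned and skipped, so effort concentrates on the still-unlearned ones. The network size, learning rate, threshold, and XOR task are arbitrary choices for illustration, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)           # XOR targets

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)            # 2-4-1 MLP
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
lr, skip_threshold = 0.5, 0.01

def forward(x):
    h = np.tanh(x @ W1 + b1)
    y = 1 / (1 + np.exp(-(h @ W2 + b2)))                   # sigmoid output
    return h, y

for epoch in range(5000):
    for x, t in zip(X, T):
        h, y = forward(x)
        err = y - t
        if float(err @ err) < skip_threshold:              # pattern already learned: skip
            continue
        # online (per-pattern) backpropagation step
        delta2 = err * y * (1 - y)
        delta1 = (delta2 @ W2.T) * (1 - h ** 2)
        W2 -= lr * np.outer(h, delta2); b2 -= lr * delta2
        W1 -= lr * np.outer(x, delta1); b1 -= lr * delta1

print(np.round(np.array([forward(x)[1] for x in X]).ravel(), 2))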

14th Asia-Pacific Software Engineering Conference (APSEC'07), 2007
Six Sigma has been adopted by many software development organizations to identify problems in software projects and processes, find optimal solutions for the identified problems, and quantitatively improve the development processes so as to achieve organizations' business goals. A Six Sigma framework for software process improvement is needed to provide a standard process and analysis tools for Six Sigma project execution, and also to provide a platform for collaboration with other process improvement approaches, such as PSP/TSP and CMM/CMMI. However, few frameworks have been proposed to support Six Sigma project execution, and most Six Sigma projects for software process improvement have been performed in an ad hoc way. In this paper, we propose a framework to support Six Sigma projects for continuous process improvement in software development. Based on this framework, we implemented a web-based tool, called SSPMT, integrated with a software project management tool and a PSP supporting tool. The suggested framework and SSPMT are beneficial in initiating and executing Six Sigma projects, facilitating data collection and analysis with Six Sigma toolkits, and standardizing the Six Sigma project execution process so as to achieve Six Sigma project goals and organizations' business goals.