Data migration is one of the vital tasks of the data integration process. It is generally regarded as the most tedious, because there is no single, systematically defined procedure: each migration must be treated as unique, since the input data sets differ and the required output format depends on the services provided as well as on user and data-handler requirements. In recent years data migration has become a vital process across public and private services, driven by technological advancement and the big-data handling requirements that accompany growing data volumes. This paper discusses the need for data migration, how a migration strategy is finalized, the various stages of the migration process, and why complete automation of data migration is not feasible.
Data migration is the process of moving data from one environment to a new one; it may be used to support migration from one database to another or between major upgrades of a database. The implementation of master data management may also require data migration. Data integration, ETL, ELT and replication, which are primarily concerned with moving data between existing environments, may be used to support the migration process. Data migration is often undertaken as part of a broader application migration (for example, migrating from SAP to Oracle, consolidating SAP environments, or migrating from one version of SAP to another) or when migrating to SaaS (software as a service) environments. In these cases it is important for data migration to be automated as much as possible, especially where the applications have been acquired directly by the business rather than via IT. Data migration projects are undertaken because they support business objectives. There are costs to the business if the migration goes wrong or the project is delayed, and the most important factor in ensuring the success of such projects is close collaboration between the business and IT. Whether this means that the project should be owned by the business, treated as a business project with support from IT, or delegated to IT with the business overseeing it is debatable; what is clear is that it must involve close collaboration.

This white paper concerns the acquisition of the FMG (Fast Moving Goods) business of one company (COM-I) by another company (COM-II), resulting in the merger of the FMG business of COM-I into COM-II. It involves migrating a huge amount of data from one company to the other as a result of the partial M&A between COM-I and COM-II, while keeping the following parameters in check: data integrity, time (duration of engagement), cost of technology, man-hours spent, downtime, and availability of the application. Migrating huge volumes of data is not only cumbersome but requires special tools and techniques to maintain data integrity. Moving data from one company to the other takes time and effort and has large cost implications that are not visible on the surface, so extensive design, planning and funding are needed.

Various tools were evaluated for the migration, but the existing system is complex: OpenROAD at the front end, Tuxedo at the middle tier and Oracle 10g at the back end, with many critical business rules applied at all three tiers that must be taken into account while migrating the data. This required considerable study and research to determine the best methodology for migrating data from one landscape to another; note that the two landscapes may be on entirely different platforms, introducing further complexity and contradictions. The application and hardware architecture of both systems had to be studied in detail for the purpose of data migration and integration. When migrating data from one environment to another, all validations (including business validations at the front end and middleware, and referential and integrity validations at the back end) must be considered and cannot be bypassed for the sake of migration.
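As a rough illustration of that last point, the sketch below (not taken from the paper) re-applies validation rules during the load step instead of bypassing them. It is a minimal sketch only: sqlite3 stands in for the OpenROAD/Tuxedo/Oracle 10g stack so the example runs anywhere, and the table, columns and rules are hypothetical.

```python
# Illustrative sketch only: sqlite3 stands in for both source and target
# databases; the accounts table and the business rules are hypothetical.
import sqlite3

def validate(row):
    """Re-apply a toy version of the rules the front end and middleware
    would normally enforce, rather than bypassing them during migration."""
    account_id, balance = row
    if account_id is None:
        return False          # referential integrity: key must exist
    if balance < 0:
        return False          # hypothetical business rule: no negative balances
    return True

def migrate(src, dst):
    dst.execute("CREATE TABLE IF NOT EXISTS accounts "
                "(account_id INTEGER PRIMARY KEY, balance REAL)")
    rejected = []
    for row in src.execute("SELECT account_id, balance FROM accounts"):
        if validate(row):
            dst.execute("INSERT INTO accounts VALUES (?, ?)", row)
        else:
            rejected.append(row)  # route to an error queue for manual review
    dst.commit()
    return rejected

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE accounts (account_id INTEGER PRIMARY KEY, balance REAL)")
    src.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, -5.0)])
    dst = sqlite3.connect(":memory:")
    print("rejected rows:", migrate(src, dst))
```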
Any type of system may replace or enhance the functionality currently delivered by a legacy system; regardless of the type of project or application, some data conversion may take place. Difficulties arise when we take the information currently maintained by the legacy system and transform it to fit into the new system. We refer to this process as data migration. Data migration is a common element of most system implementations. It can be performed once, as with a legacy system redesign, or may be an ongoing process, as in the storage of historical data in a data warehouse. Some legacy system migrations require ongoing data conversion if the incoming data requires continuous cleansing. It would seem that any two systems maintaining the same sort of data must be doing very similar things and should therefore map from one to another with ease; in practice, legacy systems have historically proven far too lenient in enforcing integrity at the atomic level of data. Another common problem concerns the design differences between hierarchical and relational systems. Data migration can be carried out in two ways, automated and manual. This paper explores the steps of manual data migration, that is, migrating data without the help of any special-purpose migration tool. Manual data cleansing is commonly performed during migration to improve data quality, eliminate redundant or obsolete information, and match the requirements of the new system correctly and efficiently.
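A minimal sketch of the kind of manual cleansing described above, assuming hypothetical legacy field names and a dd/mm/yyyy source date format: it trims and normalizes values, drops obsolete or incomplete records and removes duplicates before mapping the rest to the target layout.

```python
# Hypothetical legacy records and target layout; cleansing rules are
# illustrative assumptions, not the paper's own procedure.
from datetime import datetime

legacy_rows = [
    {"cust_name": "  ACME Ltd ", "dob": "31/12/1980", "phone": "0123-456"},
    {"cust_name": "ACME Ltd",    "dob": "31/12/1980", "phone": "0123-456"},  # duplicate
    {"cust_name": "Beta Corp",   "dob": "",           "phone": None},        # incomplete
]

def clean(row):
    """Return a cleansed record in the target format, or None to drop it."""
    name = (row.get("cust_name") or "").strip()
    if not name or not row.get("dob"):
        return None                                   # discard obsolete/incomplete data
    dob = datetime.strptime(row["dob"], "%d/%m/%Y").date().isoformat()  # dd/mm/yyyy -> ISO
    return {"customer_name": name.upper(), "date_of_birth": dob, "phone": row.get("phone")}

seen, target_rows = set(), []
for row in legacy_rows:
    cleaned = clean(row)
    if cleaned is None:
        continue
    key = (cleaned["customer_name"], cleaned["date_of_birth"])
    if key in seen:
        continue                                      # eliminate redundant records
    seen.add(key)
    target_rows.append(cleaned)

print(target_rows)
```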
IAEME PUBLICATION, 2024
This article presents a comprehensive analysis of the challenges and solutions associated with automated data migration in cloud environments, addressing the critical needs of modern enterprise digital transformation. Through extensive examination of industry practices, emerging technologies, and case studies, we identify key challenges including data integrity preservation, downtime minimization, and business continuity maintenance. While many organizations are adopting cloud-first strategies, successful migration remains a significant challenge, with 40% of projects exceeding planned downtime and budget allocations.
Modern computer systems are expected to be up continuously: even planned downtime for system reconfiguration is becoming unacceptable, so more and more changes have to be made to 'live' systems that are running production workloads. One of those changes is data migration: moving data from one storage device to another for load balancing, system expansion, failure recovery, or a myriad of other reasons. This document gives an overview of all the processes involved in data migration. Data migration is a multi-step process that begins with an analysis of the legacy data and culminates in the loading and reconciliation of data into the new applications.
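The reconciliation step mentioned above can be as simple as comparing, per table, the row count and a content fingerprint between source and target after the load. The sketch below is an assumption about how such a check might look, not the document's own method; connections, table names and key columns are placeholders.

```python
# Illustrative post-load reconciliation: row counts plus a simple hash of
# the rows read in key order from each side.
import hashlib

def table_fingerprint(conn, table, key_column):
    """Return (row_count, hash) for a table, reading rows in key order."""
    cur = conn.cursor()
    cur.execute(f"SELECT * FROM {table} ORDER BY {key_column}")
    rows = cur.fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

def reconcile(src_conn, dst_conn, tables):
    """tables: iterable of (table_name, key_column) pairs to compare."""
    report = {}
    for table, key in tables:
        src_count, src_hash = table_fingerprint(src_conn, table, key)
        dst_count, dst_hash = table_fingerprint(dst_conn, table, key)
        report[table] = {
            "counts_match": src_count == dst_count,
            "content_match": src_hash == dst_hash,
        }
    return report
```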
International Journal of INTELLIGENT SYSTEMS AND APPLICATIONS IN ENGINEERING, 2024
In this work, a procedure for converting a relational database (RDB) into an XML document is proposed. Database migration here is the process of moving schema and data from a source RDB to a destination XML document so that they can be accessed and used in the new context. The source schema is semantically enriched and translated into a target schema, and the data in the source database is transformed into a target database based on the new schema. The semantic enrichment procedure is required to create an improved metadata model from the source database and captures the key elements of the destination XML schema, making it suitable for turning RDB data into an XML document. Algorithms are designed for constructing the target database based on a set of migration rules that translate all RDB constructs into an XML Schema, from which the RDB data is subsequently transformed. A prototype system has been developed and experimentally assessed by testing its outcomes and discussing the findings; the review of the outcomes shows the proposed approach to be effective and accurate.
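To make the general idea concrete, a naive mapping turns each table into an element, each row into a child element and each column into a sub-element. This is only a sketch of the RDB-to-XML direction, not the paper's algorithm or its semantic-enrichment and migration rules, and the table and column names are hypothetical.

```python
# Simplified RDB-to-XML sketch using sqlite3 and ElementTree; the employee
# table is a made-up example.
import sqlite3
import xml.etree.ElementTree as ET

def table_to_xml(conn, table):
    """Serialize one table as <table><row><col>value</col>...</row>...</table>."""
    cursor = conn.execute(f"SELECT * FROM {table}")
    columns = [d[0] for d in cursor.description]
    table_el = ET.Element(table)
    for row in cursor:
        row_el = ET.SubElement(table_el, "row")
        for col, value in zip(columns, row):
            ET.SubElement(row_el, col).text = "" if value is None else str(value)
    return table_el

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO employee VALUES (?, ?)", [(1, "Asha"), (2, "Ravi")])
    print(ET.tostring(table_to_xml(conn, "employee"), encoding="unicode"))
```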
An important development in information technology, cloud computing allows users to share Internet-based access to pre-configured systems and services. While there are many benefits, such as cost efficiency and scalability, security remains a major concern for everyone involved. Current authentication practices have been found wanting with respect to the principles of the CIA triad: confidentiality, integrity and availability. Data transfer to the cloud, also known as data migration, moves data from on-premises databases and other systems to cloud services, and is normally associated with problems such as preserving data integrity and minimizing downtime. Additional barriers stem from continuously maturing cloud environments and varying levels of compatibility with existing database structures. This paper focuses on the processes involved in data migration and the different categories of migration, including database migration, data center migration, application migration and business process migration, stressing the significance of planning and implementing these migrations efficiently. The main issues that drive a shift to the cloud are outlined, as are the main approaches offered by large cloud suppliers such as AWS, Microsoft Azure, and Google Cloud. Additionally, potential risks and challenges, such as vendor selection, security concerns, and resource management, are explored. This comprehensive overview highlights the significance of strategic planning and vendor solutions in ensuring successful cloud data migration, while addressing the inherent risks associated with transitioning to cloud-based infrastructures.
This paper proposes a solution for various issues that arise while migrating data from old legacy systems to new systems. Business organizations implement new software application systems from time to time to replace the functionality of major processes handled by their old legacy systems. Data migration is the procedure of relocating data from one framework to another while changing the storage, database or application. Complexities arise when attempting to take the information (data) from the legacy framework (system) and change or modify it to fit into the new framework (system). The structure and data types of old legacy systems usually differ from those of the new system being implemented; the differences are not limited to table names, field names, properties or sizes. The types of databases are distinct, and the entity relationships in the new framework may not be compatible with those of the earlier legacy systems. To get the legacy data into its new application format, a certain number of modifications and transformations must take place; these are known as 'data conversion'. During the implementation of the new framework, the current structure used by the old legacy system is taken into account and mapped to the new framework being designed and implemented.
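One simple way to express such a conversion is a mapping table that drives the renaming, resizing and type changes from the legacy layout to the new one. The sketch below is illustrative only; every field name and conversion rule in it is a hypothetical example, not taken from the paper.

```python
# Hypothetical field mapping driving the 'data conversion' step: legacy
# names on the left, new names and converters on the right.
LEGACY_TO_NEW = {
    # legacy field : (new field,       converter)
    "CUST_NM":       ("customer_name", lambda v: v.strip()[:50]),                     # smaller field size
    "CUST_TYP":      ("customer_type", lambda v: {"R": "RETAIL", "W": "WHOLESALE"}[v]),  # code -> description
    "CRDT_LMT":      ("credit_limit",  lambda v: float(v)),                           # text -> numeric
}

def convert(legacy_record):
    """Apply the mapping to one legacy record and return the new-format record."""
    new_record = {}
    for legacy_field, (new_field, converter) in LEGACY_TO_NEW.items():
        new_record[new_field] = converter(legacy_record[legacy_field])
    return new_record

print(convert({"CUST_NM": "  Acme Traders  ", "CUST_TYP": "R", "CRDT_LMT": "15000"}))
```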
Cloud computing has recently become a widely discussed topic in the IT industry. More and more organizations consider using the cloud because it enables an easy and cost-efficient way of hosting applications, with dynamic scaling and geographical distribution possibilities. Still, it is not clear how and when cloud computing should be used. Existing applications are often written in a way that does not fit a cloud environment well, and certain quality attributes (e.g. performance, security or portability) can be affected. More studies are needed on how existing systems should be plugged into the cloud and what the consequences of migration are. Data migration and application migration are popular technologies that enable computing and data storage management to be autonomic and self-managing. We examine important issues in designing and developing scalable architectures and techniques for efficient and effective data migration and application migration. The first contribution we make is to investigate the opportunity of automated data migration across multi-tier storage systems.
It is recommended that specifications are produced, as there are two target audiences that need to be informed of how the data will be migrated. The first audience is the person managing the data migration, who will need information about which data is being migrated and how, as well as the target of the data. The second audience is the SQL developer, the technical engineer responsible for extracting the data, who will need to be informed of the data to be migrated as well as the format in which it is being migrated.
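One possible shape for such a specification, written here as a small Python structure so both audiences can read it: the migration manager sees which data moves where, and the SQL developer sees the extract query and the required output format. All dataset names, queries and formats below are illustrative assumptions, not taken from the source.

```python
# Hypothetical migration specification entries covering both audiences.
MIGRATION_SPEC = [
    {
        "dataset": "active_customers",                # what is being migrated
        "source": "legacy_db.CUSTOMER",               # where it comes from
        "target": "new_crm.customers",                # where it is going
        "extract_sql": "SELECT cust_id, cust_nm, crdt_lmt "
                       "FROM CUSTOMER WHERE status = 'A'",
        "output_format": "CSV, UTF-8, header row, ISO-8601 dates",
    },
]

for item in MIGRATION_SPEC:
    print(f"{item['dataset']}: {item['source']} -> {item['target']} ({item['output_format']})")
```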