Uom Synopsis-Project 19707 Report 1719732728520
A&P Library Management Ecosystem
ALMA Project Implementation
A project report submitted in partial fulfillment
of the requirements for the award of the degree of
Master of Business Administration in Data Science and Business Analytics
Semester: 3rd (Session: 2022-2024)
Mysore – 570006
Certificate
DECLARATION
I, Chandan Kumar, student of Master of Business Administration in Data Science and Business
Analytics, University of Mysore, Department of Online Programs under class Roll No.
MDS22057 for the session 2022-2024, hereby declare that the report entitled “A&P Library
Management Ecosystem: ALMA Project Implementation” has been completed by me. I
further declare that:
i. The matter embodied in this Dissertation is an original work and has not been
submitted earlier for award of any degree or diploma to the best of my knowledge and
belief. Moreover, the Dissertation does not breach any existing copyright or any other
third party rights.
ii. The Dissertation contains no such material that may be illegal and offensive.
I hereby agree to indemnify UOM and its teaching staff against any and all losses incurred in
connection with any claim or proceeding asserting plagiarism and/or copyright infringement, if
the investigation carried out determines that my work is plagiarizing or infringing.
ACKNOWLEDGEMENT
Deepest thanks to my manager, Miss Advaita Waikar (mentor and manager), for her
guidance, monitoring, constant encouragement, and for correcting our various assignments
with attention and care. She has taken pains to go through the project and training sessions
and make necessary corrections as and when needed, and I am very grateful for that.
(Chandan Kumar)
PREFACE
The purpose of this report is to assemble under one cover a sufficient body of knowledge about
managing and developing a successful computer science engineering project. This report
covers the various functions involved, such as planning, organizing, designing, and testing of
the application developed.
The Library Management Ecosystem described in this project forms an advanced network
aimed at facilitating a seamless transition from LEAP and CBD to the cloud-based ALMA system while
maintaining operational continuity for existing legacy systems. It consists of the On Prem LMS
Service, PHX LMS Service, and User Dashboard, each playing a critical role in interfacing
with the sophisticated functionalities of ALMA without disrupting the traditional operations of
the library's legacy systems, such as M1, CASSI, Ecubbed, Sheepdog, CIN, and Authority
Files.
Index
• Introduction
• Project Category
• Program Structure
• Future Scope
• Snapshots
A&P Library Management
Ecosystem
ALMA Project Implementation
Overview of JavaScript
JavaScript History:
JavaScript, not to be confused with Java, was created in 10 days in May 1995 by Brendan Eich,
then working at Netscape and now of Mozilla. JavaScript was not always known as JavaScript:
the original name was Mocha, a name chosen by Marc Andreessen, co-founder of Netscape. In
September of 1995 the name was changed to LiveScript, then in December of the same year,
upon receiving a trademark license from Sun, the name JavaScript was adopted. This was
somewhat of a marketing move at the time, with Java being very popular around then.
In 1996 - 1997 JavaScript was taken to ECMA to carve out a standard specification, which
other browser vendors could then implement based on the work done at Netscape. The work
done over this period of time eventually led to the official release of ECMA-262 Ed.1:
ECMAScript is the name of the official standard, with JavaScript being the most well known
of the implementations. ActionScript 3 is another well-known implementation of ECMAScript,
with extensions (see below).
The standards process continued in cycles, with releases of ECMAScript 2 in 1998 and
ECMAScript 3 in 1999, which is the baseline for modern day JavaScript. The "JS2" or "original
ES4" work led by Waldemar Horwat (then of Netscape, now at Google) started in 2000 and at
first, Microsoft seemed to participate and even implemented some of the proposals in their
JScript.net language.
Over time it was clear though that Microsoft had no intention of cooperating or implementing
proper JS in IE, even though they had no competing proposal and they had a partial (and
diverged at this point) implementation on the .NET server side. So by 2003 the JS2/original-
ES4 work was mothballed.
The next major event was in 2005, with two major happenings in JavaScript’s history. First,
Brendan Eich and Mozilla rejoined Ecma as a not-for-profit member and work started on E4X,
ECMA-357, which came from ex-Microsoft employees at BEA (originally acquired as
Crossgain). This led to working jointly with Macromedia, who were implementing E4X in
ActionScript 3(ActionScript 3 was a fork of Waldemar's JS2/original-ES4 work).
So, along with Macromedia (later acquired by Adobe), work restarted on ECMAScript 4 with
the goal of standardizing what was in AS3 and implementing it in SpiderMonkey. To this end,
Adobe released the "AVM2", code named Tamarin, as an open source project. But Tamarin
and AS3 were too different from web JavaScript to converge, as was realized by the parties in
2007 and 2008.
Alas, there was still turmoil between the various players; Doug Crockford — then at Yahoo!
— joined forces with Microsoft in 2007 to oppose ECMAScript 4, which led to the
ECMAScript 3.1 effort.
While all of this was happening the open source and developer communities set to work to
revolutionize what could be done with JavaScript. This community effort was sparked in 2005
when Jesse James Garrett released a white paper in which he coined the term Ajax, and
described a set of technologies, of which JavaScript was the backbone, used to create web
applications where data can be loaded in the background, avoiding the need for full page
reloads and resulting in more dynamic applications. This resulted in a renaissance period of
JavaScript usage spearheaded by open source libraries and the communities that formed around
them, with libraries such as Prototype, jQuery, Dojo, MooTools, and others being released.
In July of 2008 the disparate parties on either side came together in Oslo. This led to the
eventual agreement in early 2009 to rename ECMAScript 3.1 to ECMAScript 5 and drive the
language forward using an agenda that is known as Harmony.
All of this then brings us to today, with JavaScript entering a completely new and exciting
cycle of evolution, innovation and standardisation, with new developments such as
the Node.js platform, allowing us to use JavaScript on the server-side, and HTML5 APIs to
control user media, open up web sockets for always-on communication, get data on
geographical location and device features such as accelerometer, and more. It is an exciting
time to learn JavaScript.
JavaScript Features:
There are many features of JavaScript:
• It is lightweight.
• JavaScript is a scripting language, and it is not Java.
• The syntax of most JavaScript control statements is the same as that of control statements in
the C language.
• An important part of JavaScript is the ability to create new functions within scripts.
A function is declared in JavaScript using the function keyword.
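As a small sketch of the last point, here is how a function can be declared and called (the names used are illustrative):

```javascript
// Declaring a named function with the `function` keyword
function greet(name) {
  return "Hello, " + name + "!";
}

// A function can also be stored in a variable (a function expression)
var square = function (n) {
  return n * n;
};

console.log(greet("World")); // Hello, World!
console.log(square(4)); // 16
```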
JavaScript Uses:
There are many web applications running on the web that use JavaScript technology,
such as Gmail, Facebook, Twitter, Google Maps, YouTube, etc.
Uses of JavaScript
• Client-side validation
• Validate user input in an HTML form before sending the data to a server.
• Change the appearance of HTML documents and dynamically write HTML into separate
windows.
• Manipulate HTML "layers" including hiding, moving, and allowing the user to drag them
around a browser window.
• Displaying popup windows and dialog boxes (like alert dialog box, confirm dialog box and
prompt dialog box)
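As a sketch of client-side validation, the function below checks hypothetical form fields before they would be sent to a server; in a real page it would be wired to a form's submit event. The field names and rules here are illustrative, not taken from any particular application:

```javascript
// A minimal validation sketch; the field names and rules are
// illustrative examples only.
function validateForm(fields) {
  var errors = [];
  if (!fields.name || fields.name.trim() === "") {
    errors.push("Name is required");
  }
  // A deliberately simple email check, for illustration only
  if (!fields.email || fields.email.indexOf("@") === -1) {
    errors.push("Email looks invalid");
  }
  return errors;
}

// In a browser this would typically run on form submit, e.g.:
// document.querySelector("form").onsubmit = function (e) {
//   if (validateForm({ /* read field values here */ }).length > 0) {
//     e.preventDefault(); // stop the submission and show the errors
//   }
// };
```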
Object
Any entity that has state and behavior is known as an object. For example: a chair, pen, table,
keyboard, bike, etc. It can be physical or logical.
Class
A collection of objects is called a class. It is a logical entity.
Inheritance
When one object acquires all the properties and behaviours of a parent object, it is known as
inheritance. It provides code reusability and is used to achieve runtime polymorphism.
Polymorphism
When one task is performed in different ways, it is known as polymorphism. For example: to
convince the customer in different ways, or to draw something, e.g. a shape such as a rectangle.
Abstraction
Hiding internal details and showing only functionality is known as abstraction. For example: a
phone call, where we don't know the internal processing.
Encapsulation
Binding (or wrapping) code and data together into a single unit is known as encapsulation. For
example: a capsule, which is wrapped with different medicines.
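These concepts can be sketched in JavaScript itself using ES6 classes; the Shape and Rectangle names below are illustrative examples, not part of the project:

```javascript
// Encapsulation: state (name) and behavior (area) bound in one class
class Shape {
  constructor(name) {
    this.name = name;
  }
  // Abstraction: callers use area() without knowing the internals
  area() {
    return 0;
  }
}

// Inheritance: Rectangle acquires Shape's properties and behavior
class Rectangle extends Shape {
  constructor(width, height) {
    super("rectangle");
    this.width = width;
    this.height = height;
  }
  // Polymorphism: the same task (area) performed a different way
  area() {
    return this.width * this.height;
  }
}

var shapes = [new Shape("generic"), new Rectangle(3, 4)];
shapes.forEach(function (s) {
  console.log(s.name + ": " + s.area()); // "generic: 0" then "rectangle: 12"
});
```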
Advantage of OOPs over Procedure-oriented programming language
1) OOPs makes development and maintenance easier, whereas in a procedure-oriented
programming language it is not easy to manage code as the project size grows.
2) OOPs provides data hiding, whereas in a procedure-oriented programming language global
data can be accessed from anywhere.
3) OOPs provides the ability to simulate real-world events much more effectively. We can
provide a solution to a real-world problem using an object-oriented programming language.
HTML 5
HTML stands for Hyper Text Markup Language. HTML5 is a language for structuring and
presenting content for the world wide web. It is the fifth revision of the HTML standard
(originally created in 1990 and most recently standardized as HTML4 in 1997).
History:
Tim Berners-Lee invented the world wide web! Tim created the World Wide Web Consortium
(W3C). Its mission was to lead the world wide web to its full potential by developing protocols
and guidelines that ensure long-term growth for the web. In December 1999, HTML 4.01 was
published. After this, the people who ran the W3C declared that it would be difficult to extend
HTML 4 and they gave up on it for the next ten years!
It turns out that web developers got busy trying to bridge the gaps in HTML 4 by rolling out
custom controls, plug-ins, and libraries. Finally, key browser vendors such as Microsoft,
Google, Mozilla, and Apple got together to develop these further, eventually ending up under
the W3C HTML working group.
CSS:
CSS is an acronym for Cascading Style Sheets. It is a style sheet language used to describe the
presentation semantics (the look and formatting) of a document written in a markup language.
Although we will deal with the application of CSS to style web pages written in HTML,
the language can also be applied to any kind of XML document.
If you wanted to define a “person” object, for instance, with a few properties, here's how you
would express it using JSON syntax:
var person = {
  "name": "Foo",
  "age": 10,
  "gender": "female",
  "address": {
    "street": "street 1",
    "city": "city 1",
    "pincode": 12345
  }
};
Browsers support the JSON API which allows you to serialize and deserialize objects to and
from strings. Given the “person” object we defined above here’s how you’d produce a string
from it:
var str = JSON.stringify(person);
And here’s how you’d convert that string right back into a “person” object:
var person2 = JSON.parse(str);
With the JSON API handy it becomes rather trivial to send and receive data to and from web
servers.
HTML5 QuerySelector API
This API is used to retrieve elements using standard DOM selectors.
QuerySelector methods include:
1. querySelector() - Returns the first element in the page which matches the specified
selector rule(s).
2. querySelectorAll() - Returns all elements which match the specified
rule or rules.
ANGULAR
Angular is a popular open-source framework for building dynamic, single-page web
applications (SPAs). It's maintained by Google and leverages TypeScript, a superset of
JavaScript that adds optional static typing.
Building on the foundational concepts, let's explore some advanced aspects of Angular:
2. Modules (NgModules):
• NgModules group related components, directives, pipes, and services into cohesive
blocks of functionality.
• Every Angular application has at least one root module, conventionally called
AppModule, which bootstraps the application.
3. Directives:
• Directives are versatile tools that manipulate the DOM (Document Object Model)
based on specific conditions.
• They extend HTML's capabilities by adding custom behavior to elements or
attributes.
• Angular provides built-in directives for common tasks like handling forms, iterating
over data, and managing structural changes.
• You can also create custom directives to tailor functionality to your specific needs.
4. Pipes:
• Pipes are functions that transform data for display in the template.
• They format dates, currencies, numbers, and perform other transformations to
enhance user experience.
• Angular offers a variety of built-in pipes, and you can create custom pipes for
specialized formatting.
5. Services:
• Services are reusable classes that encapsulate application logic and data access.
• They are often injected into components to perform tasks like fetching data from an
API, handling authentication, or managing application state.
• Services promote separation of concerns, keeping your components focused on UI
logic.
6. Routing:
• Angular's router handles navigation between different views within your SPA.
• It maps URLs to specific components, ensuring the appropriate view is displayed
based on the user's navigation actions.
• This allows you to create complex applications with multiple views and seamless user
experience.
Here's a basic example showcasing several core Angular concepts:
1. Component (app.component.ts):
import { Component } from '@angular/core';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.css']
})
export class AppComponent {
  title = 'My First Angular App!';
}
This component defines the root element (app-root) of our application. It has a title property
and references the template file (app.component.html) for the view.
2. Template (app.component.html):
HTML
<h1>Welcome to {{ title }}</h1>
This template displays the title property from the component class using data binding ({{
title }}).
3. Module (app.module.ts):
TypeScript
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule],
  providers: [],
  bootstrap: [AppComponent]
})
export class AppModule { }
This module defines the building block of our application. It lists the components
(AppComponent) and imports necessary modules (BrowserModule) for functionalities like
rendering in the browser. Finally, it bootstraps the application with the AppComponent.
Explanation:
• This example demonstrates a simple component with a title, displayed in the template
using data binding.
• It showcases the basic structure of an Angular application using components, a
template, and a module.
Running the example:
1. Setup: Ensure you have Node.js and npm (Node Package Manager) installed.
2. Create Project: Use the Angular CLI to create a new project: ng new my-angular-app
3. Replace Code: Replace the generated files in src/app with the code snippets
provided above.
4. Run App: Navigate to the project directory and run: ng serve This starts a
development server and opens the application in your browser (usually at
http://localhost:4200/).
INTRODUCTION:
What is ALMA?
Alma Ex Libris is a library management system, also known as an ILS [Integrated Library
System]. It's a software suite that helps libraries manage all their resources, from physical
books and journals to digital media and online databases. Here's a breakdown of what Alma
Ex Libris can do:
• Manage all kinds of library materials: Alma can handle physical and electronic
books and journals, as well as digital resources like audio files, images, and videos.
• Streamline library operations: It helps with tasks like selecting materials for
purchase, acquiring them, cataloging them, managing their use, and digitizing
physical materials.
• Provide insights and analytics: Alma can help librarians understand how their
collections are being used and make data-driven decisions about future acquisitions.
• Cloud-based system: Unlike older library management systems, Alma is cloud-
based, which means libraries don't need to install and maintain software on their own
servers.
Unified Platform: A key advantage of Alma is its unified design. Unlike some library
systems that are cobbled together from separate programs, Alma offers a single platform to
manage everything. This simplifies workflows and saves librarians time by eliminating the
need to switch between different systems.
Open and Customizable: Alma is built to be open and integrate with other library systems
and services. This allows libraries to tailor the system to their specific needs and preferences.
Ex Libris also provides an extensive developer platform with APIs and tools for further
customization.
Focus on User Experience: Alma offers a modern and user-friendly interface that makes it
easy for librarians and library patrons to navigate. The system also features customizable
dashboards and widgets that allow users to see the information that's most important to them.
Collaboration and Community: Ex Libris fosters a user community around Alma. This
allows librarians to share best practices, troubleshoot problems, and provide feedback to the
developers.
Scalability and Security: As a cloud-based solution, Alma is scalable to meet the needs of
libraries of all sizes. Ex Libris takes data security seriously and employs robust measures to
protect library data.
Benefits for Library Users: While Alma is primarily aimed at library staff, it also benefits
library users. The system's efficiency can lead to faster processing times for requests and
easier access to library resources. Additionally, Alma can integrate with library discovery
tools, making it easier for users to find the materials they need.
Overall, Alma Ex Libris is a powerful and comprehensive library management system that
can help libraries improve efficiency, gain valuable insights, and provide a better experience
for their users.
INTRODUCTION
About Project:
The Library Management Ecosystem forms an advanced network aimed at facilitating a
seamless transition from LEAP and CBD to the cloud-based ALMA system while maintaining
operational continuity for existing legacy systems. It consists of the On Prem LMS Service,
PHX LMS Service, and User Dashboard, each playing a critical role in interfacing with the
sophisticated functionalities of ALMA without disrupting the traditional operations of the
library's legacy systems, such as M1, CASSI, Ecubbed, Sheepdog, CIN, and Authority Files.
Central to this ecosystem is the On Prem LMS Service, which acts as the on-premises
conduit, meticulously monitoring an SFTP site for bibliographic data released by ALMA. It
processes this data into a consumable format for legacy systems that are accustomed to
specific data structures inherent to LEAP and CBD. This service ensures that the integrity
and accessibility of data remain consistent as libraries pivot towards leveraging ALMA's
capabilities.
Complementing this is the PHX LMS Service, which bridges the gap between ALMA's
cloud-based features and the on-premise legacy applications. It does so by providing a set of
administrative APIs for generating and mapping unique identifiers required by ALMA, and a
collection of real-time transformation APIs that emulate LEAP's functionalities, allowing for
uninterrupted interaction with the legacy systems. These APIs facilitate the conversion of
LEAP-formatted requests into corresponding ALMA API calls and vice versa.
Together, this comprehensive ecosystem not only preserves the workflows of the existing
legacy systems but also propels the library's infrastructure into the modern era, fostering
adaptability and resilience. The ecosystem's scope envelops the development,
implementation, and ongoing support of these services, aligning with the library's strategic
goals of digital transformation and future readiness.
Project Frame Work:
High-Level Architecture
The overarching architecture integrates several key components to modernize bibliographic
data management and bridge the gap between ALMA Cloud and legacy systems. This
architecture ensures continuity of operations while leveraging the advanced capabilities of
ALMA.
Core Components
ALMA Cloud:
• Monitors an SFTP site for data published by ALMA and processes this data into a
format usable by legacy systems such as M1 COPY, M1 CAT, CASSI, and Authority
Files (AllBib).
• Acts as a watcher for new bibliographic entries to ensure they are promptly and
accurately processed for use by on-premises legacy systems.
• Provides two sets of APIs: Admin APIs for internal administrative tasks such as
generating and mapping unique identifiers like CODEN and CATNO, and ALMA
APIs that serve as a real-time data retrieval interface, mimicking LEAP procedures
for the on-premises systems.
• Handles data transformations between the LEAP format and the ALMA API
responses, ensuring seamless communication with legacy systems.
User Dashboard:
• The PHX LMS Service's Admin APIs provide librarians with tools to generate and
map unique identifiers, while the ALMA APIs replace the real-time data retrieval
functions of the legacy LEAP system.
• Legacy systems now interact with the PHX LMS Service's APIs, submitting requests
in LEAP format, which are then converted into ALMA API calls and transformed
back into the legacy system's expected response format.
• The User Dashboard allows librarians to perform specific cataloging tasks that
ALMA does not support directly, ensuring that all bibliographic data management
needs are met.
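The LEAP-to-ALMA round trip described above can be sketched as a pair of pure mapping functions. Everything in this sketch is hypothetical: the field names (recordId, mms_id), the endpoint path, and the response shape are invented for illustration and are not taken from the real PHX LMS Service or the ALMA API:

```javascript
// Hypothetical sketch of the request translation the text describes:
// a LEAP-style request is mapped to an ALMA-style call, and the
// ALMA-style response is mapped back to the flat shape a legacy
// LEAP consumer would expect. All names here are illustrative.
function leapRequestToAlmaCall(leapRequest) {
  return {
    method: "GET",
    path: "/alma/bibs/" + encodeURIComponent(leapRequest.recordId), // made-up path
    query: { view: "full" }
  };
}

function almaResponseToLeap(almaResponse) {
  return {
    recordId: almaResponse.mms_id, // made-up field names
    title: almaResponse.title,
    format: "LEAP"
  };
}
```

In the real service, a layer like this would sit between the legacy systems and the ALMA APIs so that neither side needs to change its data contract.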
Technical Design
Core Components
• MainApplication
o Initialization and Module Integration: Acts as the entry point for the
service, setting up and initiating all necessary modules, including environment
configuration, database connections, and the scheduling of file monitoring
tasks. It orchestrates the startup sequence, ensuring all components are ready
for operation.
• Watcher
o Monitoring File Changes and Triggering Events: Utilizes chokidar to
monitor specified directories on an SFTP server for new files. Upon detecting
new files, it triggers events that signal the system to process these files
accordingly.
• EventManager
o Managing the Lifecycle of Events: Responsible for managing events
triggered by the Watcher. It processes these events by logging their details and
managing their states (e.g., unprocessed, processing, completed) in the
database.
• JobQueue
o Managing the Job Queue for Processing Events: Implements a job
queue using p-queue to handle asynchronous processing of events. It ensures
that jobs are executed in a controlled and efficient manner, managing
concurrency and prioritization of tasks.
• JobProcessor
o Handling the Processing of Queued Jobs: Takes jobs from the JobQueue
and executes them. This includes parsing file data, applying necessary
transformations, and preparing the data for downstream systems or further
internal processing.
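The JobQueue behavior described above can be sketched with a minimal promise-based queue that limits concurrency. This is a simplified stand-in for what the p-queue library provides in the actual service, written here only to illustrate the controlled-concurrency idea:

```javascript
// A minimal promise-based job queue with a concurrency limit -- a
// simplified, illustrative stand-in for the p-queue library.
class SimpleJobQueue {
  constructor(concurrency) {
    this.concurrency = concurrency;
    this.running = 0;
    this.pending = [];
  }
  // Enqueue a job (a function returning a value or a promise);
  // resolves with the job's result once it has run.
  add(job) {
    return new Promise((resolve, reject) => {
      this.pending.push({ job, resolve, reject });
      this._next();
    });
  }
  // Start queued jobs while we are under the concurrency limit.
  _next() {
    while (this.running < this.concurrency && this.pending.length > 0) {
      const { job, resolve, reject } = this.pending.shift();
      this.running++;
      Promise.resolve()
        .then(job)
        .then(resolve, reject)
        .finally(() => {
          this.running--;
          this._next();
        });
    }
  }
}
```

With a concurrency of, say, 2, a third job added to the queue waits until one of the first two finishes, which is the behavior that keeps file-processing tasks from overwhelming system resources.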
Feed Services
AbstractFeedService:
It serves as the foundational base class for various feed services within the On Prem LMS
Service. It provides a comprehensive framework for file monitoring, data extraction,
transformation, and interaction with the PostgreSQL database, all tailored to the specific
requirements of handling MARC21 data and its derivatives.
• Core Functionalities:
o Data Extraction and Transformation: Employs Node.js streams and the
fast-xml-parser library to handle the extraction and initial parsing of data from
various file formats, primarily focusing on MARC21. The transformation
processes are designed to convert this data into formats like LEAP or directly
into structured data objects that can be utilized by other components of the
system.
o Job Management: Integrates PQueue for managing asynchronous tasks
with controlled concurrency. This ensures that file processing tasks are
handled efficiently without overwhelming the system resources.
o Stream Monitoring: Implements a mechanism to monitor the idle state of
data streams and automatically close them if they are inactive for an extended
period. This helps in managing system resources effectively and ensures that
the file handling processes do not lead to memory leaks or excessive resource
consumption.
• Implementation Details:
o Initialization: During the instantiation of any derived class, the
AbstractFeedService sets up essential configurations, including initializing
logging services, setting up the file monitoring paths, and preparing the system
for receiving and processing data.
o Event Handling: Defines abstract methods such as processEvent which must
be implemented by derived classes to handle specific types of data or events.
This pattern allows for the easy addition of new feed types and handling
procedures without altering the core service logic.
o Resource Cleanup: Implements methods to clean up resources, such as
destroyOutputFileWriteStream and clearStreamMonitoringInterval, which
ensure that the system does not retain unused resources after the processing
tasks are completed.
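The abstract-base-class pattern described above can be sketched as follows. The class names mirror the text, but the method bodies and the event shape are illustrative only and do not reflect the real implementation:

```javascript
// A stripped-down sketch of the pattern: the base class owns the shared
// lifecycle, while derived feed services supply processEvent.
class AbstractFeedService {
  // Abstract method: derived classes must implement their own handling.
  processEvent(event) {
    throw new Error("processEvent() must be implemented by a subclass");
  }
  // Shared entry point; in the real service this would also cover
  // logging, stream setup, and resource cleanup.
  handle(event) {
    return this.processEvent(event);
  }
}

class M1CatFeedService extends AbstractFeedService {
  processEvent(event) {
    // e.g. parse MARC21 data and transform it for the M1 system
    return { type: "M1_CAT_FILE", file: event.file, status: "processed" };
  }
}
```

New feed types can then be added by writing one more subclass, without touching the core service logic, which is exactly the extensibility the design aims for.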
M1CatFeedService:
• Core Functionalities:
o Event Processing: Implements the processEvent method to handle specific
file events associated with M1 category files. This involves reading the file
content, parsing the MARC21 data, and applying necessary transformations to
align with M1 cataloguing requirements.
o File Extraction and Parsing: Utilizes custom parsing methods to extract
MARC21 data from incoming files. This includes sophisticated error handling
and data validation to ensure that the data conforms to expected formats and
standards.
o Data Transformation: Converts the extracted MARC21 data into the
specific format required by the M1 system. This might include reformatting
data fields, mapping MARC fields to M1-specific fields, and applying
business logic that is pertinent to the categorization and cataloguing processes
of M1.
o Output File Management: Manages the generation and storage of
transformed files, ensuring they are saved in appropriate locations and are
readily accessible for the M1 system. This involves naming conventions,
directory management, and ensuring data integrity during the write operations.
• Implementation Details:
o Custom Job Handling: Overrides the getJob method to handle jobs
specifically tailored for M1_CAT_FILE processing. This includes setting up
job-specific parameters, handling job sequencing, and ensuring that each job is
logged and tracked correctly.
o Integration with Legacy Systems: Ensures compatibility with legacy
systems by maintaining data formats and protocols expected by the M1
system. This might involve legacy data handling techniques, custom libraries
for data transformation, and interfaces designed for older system architectures.
M1CopyFeedService
• Core Functionalities:
o Event Handling: Implements the processEvent method to specifically
manage M1_COPY_FILE events. This method is responsible for orchestrating
the entire workflow of reading, processing, and transforming bibliographic
and holdings data into the format required by the M1 copy management
system.
o Data Extraction and Transformation: Focuses on extracting data from
structured MARC21 files and transforming it into the specific data structure
required by the M1 copy system. This includes parsing standard bibliographic
data as well as extracting and reformatting holdings and item-specific
information.
o File and Data Management: Manages the output of transformed files to
ensure they are correctly formatted, named, and stored in designated
directories for easy retrieval and integration by the M1 system.
• Implementation Details:
o Customized Data Handling: Tailors data handling procedures to meet the
specific needs of copy-related data processing. This includes custom parsing
logic to handle the intricacies of holdings and item data in MARC21 format.
CassiFeedService
• Core Functionalities:
o Event Processing: The service reads data from files triggered by
CASSI_FILE events, processes this data according to the CAS requirements,
and prepares it for integration or storage.
o Data Transformation: This service is responsible for transforming
bibliographic information into the specific format required by CAS. It ensures
that all necessary data fields are correctly formatted and validated.
• Implementation Details:
o Integration with CAS Systems: CassiFeedService ensures that the output
is compatible with the existing databases and systems used by the Chemical
Abstracts Service, facilitating accurate data exchange and integration.
AllBibFeedService
• Core Functionalities
o Event Processing: Initiates processing of files from ALL_BIB_FILE events.
It is responsible for parsing and transforming bibliographic MARC21 data into
multiple specific formats required for different downstream systems.
o Multiple Output Files Generation: Generates several types of output files
based on the bibliographic data processed. These include:
▪ Authority Family Titles: For cataloging and metadata enhancement.
▪ Authority Notes: Contains annotations and additional information
for bibliographic entries.
▪ Authority File LEAP NV: Processes and stores normalized view data
for authority control.
▪ Authority File LEAP NV Big: Handles larger datasets of normalized
view bibliographic information.
▪ Authority File CIN NV: Specifically formatted for Chemical
Indexing Notes.
▪ Authority File CIN CA NV: Combines Chemical Abstracts and
indexing notes.
▪ Authority File Dropped Titles: Maintains records of titles that are
not carried forward in new bibliographic updates.
• Implementation Details
o Data Transformation and Formatting: Applies specialized
transformation logic to convert raw bibliographic data into structured formats
required by each output file. This includes parsing, data normalization, and
applying specific business rules relevant to bibliographic information
handling.
o Streamlined Data Handling: Utilizes advanced file handling and streaming
technologies to manage large datasets efficiently, ensuring that even extensive
bibliographic entries are processed swiftly and accurately.
Transformation Services
MarcToLeapService
• Core Functionalities
o Data Conversion: Efficiently transforms MARC21 bibliographic records
into LEAP format, which is designed to be more adaptable for modern web
services and applications within the library ecosystem.
o Field Mapping: Maps various MARC fields to their corresponding LEAP
fields, ensuring that all relevant bibliographic information is retained and
accurately represented in the new format.
• Implementation Details
o Standard Compliance: Ensures that the conversion process adheres to
established standards for both MARC and LEAP, maintaining the integrity
and accuracy of bibliographic data throughout the transformation.
o Customizable Transformations: Provides flexibility to adapt the
transformation process based on specific requirements or updates in the
MARC and LEAP specifications, allowing for customized handling of
bibliographic data.
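The field-mapping idea behind MarcToLeapService can be sketched as a lookup table applied over a record's fields. The real MARC21-to-LEAP rules are far richer (subfields, indicators, repeatable fields); the mapping and output shape below are a hypothetical simplification:

```javascript
// Illustrative only: a tiny MARC tag -> LEAP field lookup table.
// Real conversions handle subfields, indicators, and repetition.
var marcToLeapMap = {
  "245": "title",     // MARC 245 = title statement
  "022": "issn",      // MARC 022 = ISSN
  "260": "publisher"  // MARC 260 = publication information
};

function marcRecordToLeap(marcFields) {
  var leap = {};
  marcFields.forEach(function (field) {
    var leapKey = marcToLeapMap[field.tag];
    if (leapKey) {
      leap[leapKey] = field.value; // unmapped tags are simply dropped here
    }
  });
  return leap;
}
```

Keeping the mapping in a data table rather than in code is what makes the transformation customizable when the MARC or LEAP specifications change, as the text notes.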
LeapToSgmlService
• Core Functionalities
o Data Transformation: Converts LEAP format data into SGML, which is
essential for creating structured documents that can be used in digital archives
and publishing platforms.
o Template-Driven Conversion: Utilizes templates to ensure that the SGML
output adheres to specific document standards and styles required for various
publishing needs within the library.
• Implementation Details
o Flexibility and Scalability: Designed to handle varying complexities of
LEAP data, ensuring that even the most detailed bibliographic records are
accurately represented in the SGML format.
o Integration with Publishing Tools: The service is engineered to integrate
seamlessly with existing publishing tools and workflows, allowing for the
smooth transition of data from library systems to publishing platforms.
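A template-driven LEAP-to-SGML conversion of the kind described above can be sketched as follows. The element names and the flat record shape are illustrative assumptions, not the library's actual DTD:

```typescript
// A tiny template-driven converter: each LEAP field is emitted as an SGML
// element. Element names here are illustrative, not the real DTD.
function leapToSgml(record: Record<string, string>): string {
  const body = Object.entries(record)
    .map(([field, value]) => `  <${field}>${escapeSgml(value)}</${field}>`)
    .join('\n');
  return `<record>\n${body}\n</record>`;
}

// Minimal escaping so markup characters in values do not break the document.
function escapeSgml(value: string): string {
  return value.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
}

console.log(leapToSgml({ title: 'A & B', issn: '1234-5678' }));
```

Swapping the inline string template for external template files is what makes the conversion adaptable to different publishing styles, as the section notes.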
Development Environment: Local Setup
Local Setup Instructions for PHX LMS Service and On Prem LMS Service
The local development environment setup ensures that developers can run and test the
applications on their machines before deploying to production. Here are the steps to set up the
PHX LMS Service and On Prem LMS Service locally:
1. Install Node.js:
o Ensure Node.js version 20 is installed on your local machine. You can
download it from the official Node.js website.
2. Install PostgreSQL(Optional):
o Install PostgreSQL and create the databases as specified in the .env files for
each service. Ensure that the user has the necessary permissions to create,
modify, and delete data within these databases.
o If you already have a DB instance available (e.g., AWS RDS), use it.
3. Clone the Repository:
o Clone the repositories for both PHX LMS Service and On Prem LMS Service
from codecommit to your local machine.
4. Install Dependencies:
o Navigate to the root directory of each project in your command line interface
and run:
npm install
o This command installs all the necessary Node.js packages defined in
package.json.
5. Global Installation of PM2:
o PM2 is used for process management. Install it globally using npm:
npm install pm2 -g
6. Environment Configuration:
o Set up the environment variables as detailed in the .env files for each service.
Make sure all paths and database credentials are correctly configured for your
local setup.
o For the PHX LMS Service, ensure an AWS profile is configured correctly to
fetch database credentials from AWS Secret Manager if connecting to an
AWS development database.
7. Database Migrations:
o Run database migrations to set up the initial schema and any subsequent
updates. Navigate to the project directory and run the service's migration
script.
o This step prepares the database with the required tables and relationships
for the service.
8. Starting the Application:
o To start each application, use the PM2 commands provided in the
package.json scripts. For example, to start the On Prem LMS Service in
development mode, run the development start script defined in package.json.
o For the PHX LMS Service, use the corresponding PM2 start command.
9. Verify the Setup:
o After setup, access the services via the browser at the APP_URL and
APP_PORT to ensure they are running correctly. Access the Swagger UI for
the PHX LMS Service at:
http://localhost:8080/swagger
o This URL allows you to interact with the API and confirm that the services
are operational.
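Step 6 above (and the checkEnvironmentVariables logic shown later in MainApplication.ts) can be sketched as a small pre-flight check. The variable names are taken from elsewhere in this report (APP_URL, APP_PORT, DATABASE_URL, WATCH_DIR); treat the exact required list as an assumption:

```typescript
// Hypothetical pre-flight check mirroring step 6: verify that every
// variable named in .env is actually set before starting the service.
const REQUIRED_VARS = ['APP_URL', 'APP_PORT', 'DATABASE_URL', 'WATCH_DIR'];

// Return the names of required variables that are unset or blank.
function missingEnvVars(
  env: Record<string, string | undefined>,
  required: string[],
): string[] {
  return required.filter(name => !env[name] || env[name]!.trim() === '');
}

const missing = missingEnvVars(process.env, REQUIRED_VARS);
if (missing.length > 0) {
  console.error(`Missing required environment variables: ${missing.join(', ')}`);
} else {
  console.log('Environment looks complete; safe to start the service.');
}
```

Running a check like this before `pm2 start` surfaces configuration mistakes immediately instead of at first database or SFTP access.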
Deployment Instructions for On Prem LMS Service
Pre-Deployment Setup
Before deploying the On Prem LMS Service, ensure that a suitable Virtual Machine (VM) is
available. The VM must meet the following criteria:
Software Installation
1. Node.js: Install Node.js, which is the runtime environment for the service. Ensure it's
a version compatible with the service, ideally the latest LTS (Long Term Support)
version.
2. PostgreSQL Client (if direct database access is required for management or
troubleshooting).
3. Additional Dependencies:
o PM2: Install PM2 globally using npm install pm2 -g. PM2 is a process
manager for Node.js applications, which helps in managing and keeping the
applications online.
ALMA Configuration
SFTP Configuration
1. Initial Run Handling:
o To ensure that the On Prem LMS Service does not process historical data on
its first run, temporarily configure the FTP setting in ALMA to use "Ex Libris
Secure FTP Service" which effectively prevents data from being pushed to the
SFTP directory managed by CAS.
o After the first run, switch the FTP configuration back to the CAS SFTP
config to start regular data updates. This step is crucial to ensure that only
delta data relevant to current operational needs, such as the last hour's
updates for the M1 CAT and COPY profiles, is processed.
2. SFTP Server Setup:
o Configure the SFTP server details in ALMA to point to the CAS SFTP server
where the On Prem LMS Service is hosted.
o Ensure that the SFTP credentials provided have the necessary permissions to
write to the WATCH_DIR.
Service Start-up
1. Once all configurations are verified, start the On Prem LMS Service in production
mode using the production start script defined in package.json.
2. This command compiles the TypeScript code and starts the service using PM2 with
the specified environment settings.
Post-Deployment Checks
1. After deployment, monitor the service logs to confirm that it is processing files
correctly and interacting with the database and SFTP site without errors.
2. Check that the output files are being generated in the specified directories and that
they conform to the expected formats.
PM2
PM2 stands for Process Manager 2. It's a popular open-source tool specifically designed to
manage Node.js applications. Here's a breakdown of what PM2 does:
• Keeps your Node.js applications running smoothly: PM2 ensures your applications
stay up and running 24/7. It can automatically restart your application if it crashes or
encounters errors.
• Manages multiple applications: If you're running multiple Node.js applications on
the same server, PM2 lets you easily start, stop, and monitor them all from a central
location.
• Automatic reloading: PM2 can automatically reload your application whenever you
make changes to your code. This means you don't need to manually restart the
application every time you make an update, minimizing downtime.
• Centralized logging: PM2 aggregates logs from all your Node.js applications,
making it easier to troubleshoot issues.
• Load balancing: PM2 can distribute traffic across multiple instances of your
application, improving performance and scalability.
• User-friendly interface: PM2 comes with a command-line interface (CLI) that
makes it easy to manage your applications.
Here are some additional benefits of using PM2:
• Easy to install and use: PM2 can be easily installed using the Node Package
Manager (npm).
• Production-ready: PM2 is a robust tool that's well-suited for production
environments.
• Large community: PM2 has a large and active community of developers, which
means there are plenty of resources available to help you get started and troubleshoot
any issues you encounter.
Overall, PM2 is a valuable tool for anyone who develops or manages Node.js applications. It
helps to ensure that your applications are always running smoothly and efficiently.
PM2 offers a variety of commands for managing your Node.js applications. Here are some of
the most commonly used ones:
Process Management:
• pm2 start <script>: Starts a Node.js application. You can specify the script name (e.g.,
app.js) or a path to a JSON configuration file.
• pm2 stop <app_name|id>: Stops a running application. You can specify the
application name you assigned during startup or its process ID.
• pm2 restart <app_name|id>: Restarts a running application. Useful for applying code
changes without downtime (for certain application types).
• pm2 reload <app_name|id>: Reloads a running application with minimal downtime
(suited for applications designed for hot reloading).
• pm2 delete <app_name|id>: Removes an application from PM2's management list.
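As noted above, pm2 start also accepts a configuration file. A minimal hypothetical JSON ecosystem file for the two services might look like the following; the app names and script paths are illustrative, not taken from the actual repositories:

```json
{
  "apps": [
    {
      "name": "on-prem-lms-service",
      "script": "dist/main.js",
      "env": { "NODE_ENV": "production" }
    },
    {
      "name": "phx-lms-service",
      "script": "dist/main.js",
      "env": { "NODE_ENV": "production" }
    }
  ]
}
```

Both services could then be started together with pm2 start ecosystem.json, and their process list persisted with pm2 save.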
Monitoring:
• pm2 list|ls|status: Lists all running applications managed by PM2, displaying their
status (started, stopped, etc.).
• pm2 monit: Provides a real-time dashboard showing information about all running
applications, including CPU, memory usage, and uptime.
• pm2 logs <app_name|id>: Shows the logs for a specific application.
• pm2 prettylist|jlist: Displays the application list in a more readable format (prettylist)
or JSON format (jlist).
Other Useful Commands:
• pm2 save: Saves the current process list to a file, allowing PM2 to restore
the same applications automatically after a reboot (together with pm2 startup
or pm2 resurrect).
OBJECTIVES
Project Objectives:
The objective of the project “ALMA” is to facilitate a seamless transition from LEAP and CBD
to the cloud-based ALMA system while maintaining operational continuity for existing legacy
systems.
PROJECT CATEGORY
The project falls under the Online Web Portal and On-Prem Application category, since
it is mainly responsible for creating the portal with an online database at the
backend. Central to this ecosystem is the On Prem LMS Service, which acts as the on-
premises conduit, meticulously monitoring an SFTP site for bibliographic data released
by ALMA. It processes this data into a consumable format for legacy systems that are
accustomed to the specific data structures inherent to LEAP and CBD. This service
ensures that the integrity and accessibility of data remain consistent as libraries
pivot towards leveraging ALMA's capabilities.
Complementing this is the PHX LMS Service, which bridges the gap between ALMA's cloud-
based features and the on-premise legacy applications. It does so by providing a set of
administrative APIs for generating and mapping unique identifiers required by ALMA, and a
collection of real-time transformation APIs that emulate LEAP's functionalities, allowing for
uninterrupted interaction with the legacy systems. These APIs facilitate the conversion of
LEAP-formatted requests into corresponding ALMA API calls and vice versa.
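The LEAP-to-ALMA request translation described above can be sketched as follows. The CATNO-to-MMS-ID mapping and the identifier format are stand-ins for the PHX LMS mapping APIs; the /almaws/v1/bibs path follows the public ALMA REST convention, but should be checked against the deployed configuration:

```typescript
// Hypothetical translation layer: a legacy LEAP-style lookup request is
// rewritten as an ALMA REST call. Names and formats are illustrative.
interface LeapRequest { command: string; catno: string; }

function toAlmaUrl(req: LeapRequest, baseUrl: string): string {
  // LEAP identifies records by CATNO; ALMA addresses them by MMS ID, so a
  // real implementation would consult the PHX LMS identifier mapping first.
  const mmsId = lookupMmsId(req.catno);
  return `${baseUrl}/almaws/v1/bibs/${mmsId}`;
}

// Stand-in for the identifier-mapping API described above.
function lookupMmsId(catno: string): string {
  return `99${catno}0001`; // fabricated identifier format, for illustration only
}

console.log(toAlmaUrl({ command: 'GET_BIB', catno: '123456' }, 'https://api.example.org'));
// prints: https://api.example.org/almaws/v1/bibs/9912345600001 -- wait, see note below
```

The reverse direction (ALMA responses rendered back into LEAP-shaped payloads) would be a mirror-image transformation over the response body.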
PROGRAM STRUCTURE
Analysis Report:
System analysis is the first step of the software building process. Its purpose is to
understand the system requirements; identify the data, functional, and behavioral
requirements; and build models of the system for better understanding.
In the process of system analysis, one should first understand what the present system
is, what it does, and how it works (i.e., its processes). After analyzing these points,
we are able to identify the problems the present system faces. Upon evaluating the
current problems and the desired information (inputs to and outputs from the system),
the analyst looks towards one or more solutions. To begin with, the data objects,
processing functions, and behavior of the system are defined in detail. After this,
models are built from three different aspects of the system: data, function, and
behavior. The models created during system analysis help in better understanding the
data and control flow, functional processing, operational behavior, and information
content.
PROJECT DESCRIPTION/ CODE:
Dashboard UI
Home.component.html
<div class="p-4">
<div class="container">
<h1>Logged-in User: {{user?.firstName}}!</h1>
</div>
</div>
Home.component.spec.ts
import { ComponentFixture, TestBed } from '@angular/core/testing';
import { HttpClientModule } from '@angular/common/http';
import { HomeComponent } from './home.component'; // assumed path
import { AccountService } from '../services/account.service'; // assumed path
describe('HomeComponent', () => {
let component: HomeComponent;
let fixture: ComponentFixture<HomeComponent>;
let service: AccountService;
beforeEach(() => {
TestBed.configureTestingModule({
imports: [HttpClientModule],
declarations: [HomeComponent],
providers: [AccountService]
});
fixture = TestBed.createComponent(HomeComponent);
service = TestBed.inject(AccountService);
component = fixture.componentInstance;
fixture.detectChanges();
});
});
LMS.MODULE.TS
@NgModule({
declarations: [
LmsComponent
],
imports: [
FormsModule,
ReactiveFormsModule,
CommonModule,
RouterModule.forChild(LmsRoutes),
SidebarComponent
]
})
export class LmsModule { }
lms.routing.module.ts
loadChildren: () =>
import('./coden/coden.module').then(m => m.CodenModule)
},
{
path: 'abbreviate',
loadChildren: () =>
import('./abbreviate/abbreviate.module').then(m => m.AbbreviateModule)
}
]
},
]
Generate-coden.component.html
<div class="container-fluid">
<div class="text-center">
<h2>Generate Coden</h2>
</div>
<br>
<form [formGroup]="codenForm">
<div class="col-md-6 mx-auto my-3">
<label class="mb-1" for="publicationTitle">Publication
Title:</label>
<input type="text"
class="form-control"
formControlName="publicationTitle"
id="publicationTitle"
placeholder="Enter Publication Title">
</div>
<div class="col-md-6 mx-auto my-3">
<div class="row">
<div class="d-grid col-md-8 my-3">
<div ngbDropdown class="d-inline-block">
<button class="btn btn-outline-primary"
id="dropdownBasic1" ngbDropdownToggle>
{{ codenForm.value.publicationType ||
'Select Publication Type'}}
</button>
<div ngbDropdownMenu aria-labelledby="dropdownBasic1">
    <button (click)="onPublicationTypeChange(publicationTypes.Published_Serial)"
        ngbDropdownItem>{{publicationTypes.Published_Serial}}</button>
    <button (click)="onPublicationTypeChange(publicationTypes.Title_Change)"
        ngbDropdownItem>{{publicationTypes.Title_Change}}</button>
    <button (click)="onPublicationTypeChange(publicationTypes.Discontinued_Publishing)"
        ngbDropdownItem>{{publicationTypes.Discontinued_Publishing}}</button>
    <button (click)="onPublicationTypeChange(publicationTypes.Conference_Proceeding)"
        ngbDropdownItem>{{publicationTypes.Conference_Proceeding}}</button>
    <button (click)="onPublicationTypeChange(publicationTypes.Monographic_Collection)"
        ngbDropdownItem>{{publicationTypes.Monographic_Collection}}</button>
    <button (click)="onPublicationTypeChange(publicationTypes.Title_Cross_Reference)"
        ngbDropdownItem>{{publicationTypes.Title_Cross_Reference}}</button>
</div>
</div>
</div>
<div class="col-md-4 my-3">
<div class="justify-content-md-end">
<button class="btn btn-primary"
type="submit"
[disabled]="!codenForm.valid"
(click)="generateCoden()"
>Generate CODEN</button>
</div>
</div>
</div>
</div>
</form>
<br>
<div class="row" *ngIf="isCodenGenerated">
<div class="col-12 col-lg-6">
<div class="d-flex justify-content-center justify-content-lg-end">
<div class="col-12 col-lg-6">
<div class="d-flex justify-content-center justify-content-lg-start">
<div>
<label class="pt-1">CODEN:</label>
</div>
<div class="px-3">
<input type="text"
class="form-control"
[value]="codenValue" (change)="updateCoden($event)">
<div *ngIf="errorMessage">{{errorMessage}}</div>
</div>
</div>
</div>
</div>
</div>
<div class="col-12 col-lg-6 py-3 py-lg-0">
<div class="d-flex justify-content-center justify-content-lg-start">
<div class="col-12 col-lg-6">
<div class="d-flex justify-content-center justify-content-lg-end">
<div>
<button [class]="copyClass"
(click)="copyText()">
{{copyButtonValue}}
<i class="bi bi-check-circle-fill green" *ngIf="isCopied"></i>
</button>
</div>
<div class="px-3">
<button [class]="mapCodenClass"
(click)="openCodenMappingModal()">Map
CODEN
</button>
</div>
</div>
</div>
</div>
</div>
<div class="col-12 py-3 d-flex justify-content-center">
<ngb-alert #selfClosingAlert [dismissible]="true"
type='success' (closed)="successMessage = ''" *ngIf="successMessage">
<strong>{{successMessage}}</strong>
</ngb-alert>
</div>
</div>
<div class="row justify-content-md-center" *ngIf="loader">
<div class="col-md-3">
<div class="row justify-content-md-start">
<div class="col-md-3">
<label class="pt-1">CODEN:</label>
</div>
<div class="col-md-9">
<ngx-skeleton-loader [theme]="{height: '30px'}"
    appearance="line" count="1" direction="horizontal"></ngx-skeleton-loader>
</div>
</div>
</div>
<div class="col-md-3">
<ngx-skeleton-loader count="2" appearance="line"
    [theme]="{width: '70px', height: '30px', float: 'inline-end', 'margin-left': '20px'}">
</ngx-skeleton-loader>
</div>
</div>
Generate-coden.component.ts
@Component({
selector: 'app-generate-coden-form',
templateUrl: './generate-coden-form.component.html',
styleUrls: ['./generate-coden-form.component.scss']
})
export class GenerateCodenFormComponent {
@ViewChild('selfClosingAlert', { static: false }) selfClosingAlert!:
NgbAlert;
private _message$ = new Subject<string>();
this.codenForm = this.fb.group({
publicationTitle: ['', Validators.required],
publicationType: ['', Validators.required]
});
this._message$
.pipe(
takeUntilDestroyed(),
tap((message) => (this.successMessage = message)),
debounceTime(5000),
)
.subscribe(() => this.selfClosingAlert?.close());
}
onPublicationTypeChange(event: any) {
this.isGenerateDisabled = false;
// this.codenForm.value.publicationType =event;
this.codenForm.patchValue({publicationType: event});
}
generateCoden() {
this.isCodenGenerated = false;
this.isCopied = false;
this.loader = true;
this.copyButtonValue = 'Copy';
this.codenValue = '';
console.log(this.codenForm.value);
console.log(this.codenForm.value.publicationTitle + " :: " +
this.codenForm.value.publicationType);
const type = this.codenForm.value.publicationType.substring(0,1);
this.codenService
.generateCoden(this.codenForm.value.publicationTitle, type)
.subscribe({
next: (response: any) => {
this.codenValue = response.coden;
this.loader = false;
this.isCodenGenerated = true;
this.errorMessage = ''
},
error: (error) => {
this.loader = false;
this.isCopied = true;
if (error != undefined) {
this.errorMessage = error;
}else {
this.errorMessage = 'No Records Found';
}
const modalRef = this.modalService.open(ModelErrorComponent, {
size: 'm',
backdropClass: '#5cb3fd'
});
modalRef.componentInstance.errorMessage = this.errorMessage;
},
complete: () => {
this.loader = false;
}
});
}
copyText(): void {
this.copyButtonValue = 'Copied';
this.isCopied = true;
this.copyClass = 'btn btn-outline-success';
navigator.clipboard.writeText(this.codenValue!).catch(() => {
console.error("Unable to copy text");
});
}
openCodenMappingModal() {
this.showSuccessAlert = false;
this.mapCodenClass = 'btn btn-outline-dark';
const modalRef = this.modalService.open(MapCodenModalComponent, {
size: 'lg'
});
modalRef.componentInstance.codenValue = this.codenValue;
modalRef.componentInstance.emitService.subscribe((value: any) => {
if (value === 'success') {
this.changeSuccessMessage(this.codenValue + ' is added successfully.');
}
});
}
updateCoden(event: any) {
console.log(JSON.stringify(event));
this.codenValue = event.target.value;
}
}
Generate-coden-mapping.component.html
<div class="d-flex container-fluid" style="float: right; margin:
20px;">
<button class="btn btn-primary" (click)="launchCodenMappingModel({},
false)">Map CODEN</button>
</div>
<form [formGroup]="filtersForm" (submit)="performFilter()">
<div class="table-scrollable rTable p-3">
<div class="rTableRow-filter" *ngIf="!loader">
<div class="rTableHead-filter">
<input type="text" formControlName="coden"
placeholder="CODEN" class="form-control" (keyup)="onKeydown($event)"
autofocus />
</div>
<div class="rTableHead-filter">
<input type="text" formControlName="catno"
placeholder="CATNO" class="form-control" (keyup)="onKeydown($event)"/>
</div>
<div class="rTableHead-filter">
<input type="text" formControlName="issn" placeholder="ISSN"
class="form-control" (keyup)="onKeydown($event)"/>
</div>
<div class="rTableHead-filter">
<input type="text" formControlName="isbn" placeholder="ISBN"
class="form-control" (keyup)="onKeydown($event)"/>
</div>
<div class="rTableHead-filter">
<input type="text" formControlName="mmsid"
placeholder="MMSID" class="form-control" (keyup)="onKeydown($event)"/>
</div>
</div>
<div class="rTableRow">
<ng-container *ngFor="let t of tableHeader">
<div class="rTableHead" [style]="t.style">
<strong><a href="javascript:void(0)"
(click)="onSort(t.key)"> {{ t.title }}
<span class="d-inline-block position-relative m-1"
style="height: 1rem; width: 1rem;">
<i class="position-absolute"
    [ngClass]="{'bi bi-sort-up': isAscending && sortedColumn === t.key,
    'bi bi-sort-down': isDescending && sortedColumn === t.key}"
    style="font-size: small; top: 5px;"></i>
</span>
</a>
</strong></div>
</ng-container>
<div class="rTableHead"><strong>Action</strong></div>
</div>
<ng-container *ngFor="let coden of codenList; index as i">
<div class="rTableRow" (dblclick)="rowClickMethod(coden.coden)"
*ngIf="!loader">
<div class="rTableHead text-truncate trunk-text-max" data-bs-toggle="tooltip"
    data-bs-placement="bottom" title="{{coden.coden}}"> {{ coden.coden }}</div>
<div class="rTableHead text-truncate trunk-text-max" data-bs-toggle="tooltip"
    data-bs-placement="bottom" title="{{coden.catno}}"> {{ coden.catno }}</div>
<div class="rTableHead text-truncate trunk-text-max" data-bs-toggle="tooltip"
    data-bs-placement="bottom" title="{{coden.issn}}"> {{ coden.issn }}</div>
<div class="rTableHead text-truncate trunk-text-max" data-bs-toggle="tooltip"
    data-bs-placement="bottom" title="{{coden.isbn}}"> {{ coden.isbn }}</div>
<div class="rTableHead text-truncate trunk-text-max" data-bs-toggle="tooltip"
    data-bs-placement="bottom" title="{{coden.mmsid}}"> {{ coden.mmsid }} </div>
<div class="rTableHead">
<div class="list-group list-group-horizontal action-menu">
<button (click)="launchCodenMappingModel(coden, true)"
class="btn btn-sm rounded-0" type="button" title="Edit">
<i class="bi bi-pencil-fill"></i>
</button>
<button (click)="deleteCoden(coden)" class="btn btn-sm
rounded-0" type="button" title="Delete">
<i class="bi bi-trash-fill"></i>
</button>
</div>
</div>
</div>
</ng-container>
</div>
<table class="table-scrollable rTable p-4" *ngIf="loader">
<tr *ngFor="let _ of [].constructor(loaderRowCount); let i =
index">
<ng-container *ngFor="let _ of
[].constructor(tableHeader.length); let j = index">
<td class="col-md-1 px-md-3">
<ngx-skeleton-loader appearance="line" count="1"
direction="horizontal"></ngx-skeleton-loader>
</td>
</ng-container>
<td class="col-md-1">
<ngx-skeleton-loader appearance="circle" count="2"
direction="horizontal"></ngx-skeleton-loader>
</td>
</tr>
</table>
<div class="ms-3" *ngIf="!codenList?.length">
No record found
</div>
</form>
<div class="d-flex container-fluid" style="float: right; margin: 20px;"
*ngIf="!loader">
<ngb-pagination *ngIf="collectionSize"
(pageChange)="onPageChange($event)" [collectionSize]="collectionSize"
[(page)]="pageNumber" [maxSize]="5"
[boundaryLinks]="true" [pageSize]="pageSize" class="d-flex
justify-content-end"></ngb-pagination>
</div>
Coden-mapping.component.ts
@Component({
selector: 'app-coden-mapping-table',
templateUrl: './coden-mapping-table.component.html',
styleUrls: ['./coden-mapping-table.component.scss']
})
export class CodenMappingTableComponent implements OnInit{
filtersForm: FormGroup;
codenList!: any[];
allCodenList!: any[];
loaderRowCount = 10;
loader = true;
//sort
column ='coden'
direction ='asc';
sortedColumn: string = 'coden';
isAscending: boolean = true;
isDescending: boolean = false;
//pagination
pageNumber = 1;
pageSize = 12; //rows per page
collectionSize: number = 100;
currentPage = 1;
//query
queryParams: any = {};
queryParamData : any = {};
//tableHeader
tableHeader = [
{ key: 'coden', title: 'CODEN', style: 'width:20%', sort: true,
direction: 'asc'},
{ key: 'catno', title: 'CATNO', style: 'width:20%', sort: false,
direction: ''},
{ key: 'issn', title: 'ISSN', style: 'width:20%', sort: false,
direction: ''},
{ key: 'isbn', title: 'ISBN', style: 'width:20%', sort: false,
direction: '' },
{ key: 'mmsid', title: 'MMSID', style: 'width:20%', sort: false,
direction: '' }
];
constructor(
private fb: FormBuilder,
private router: Router,
private route: ActivatedRoute,
private modalService: NgbModal,
private codenService: CodenService,
){
this.filtersForm = this.fb.group({
coden: [null],
catno: [null],
issn:[null],
isbn:[null],
mmsid: [null]
});
}
ngOnInit() {
this.queryParams = {
size: this.pageSize,
page: this.pageNumber
}
this.queryParamData = {
page: this.pageNumber,
column: this.column,
direction: this.direction
}
this.updateQueryParams();
this.getCodenMappingList();
}
rowClickMethod(event: any) {
//alert('Row double clicked: ' + event);
}
onSort(column: any) {
if (column === this.sortedColumn) {
this.isAscending = !this.isAscending;
this.isDescending = !this.isDescending;
} else {
this.sortedColumn = column;
this.isAscending = true;
this.isDescending = false;
}
this.codenList = [...this.codenList].sort(
(a: any, b: any) => {
const res = this.compare(a[column], b[column]);
return this.isAscending ? res : -res;
}
);
this.queryParamData.column = column;
this.queryParams.sortBy = column;
this.queryParamData.direction = this.isDescending ? 'desc' : 'asc';
this.queryParams.sortOrder = this.isDescending ? 'DESC' : 'ASC';
this.updateQueryParams();
this.getCodenMappingList();
}
onPageChange(pageNumber:any){
this.pageNumber = pageNumber;
this.queryParams.page = this.pageNumber;
this.queryParamData.page = this.pageNumber
this.updateQueryParams();
this.getCodenMappingList();
}
private updateQueryParams() {
this.router.navigate([], {
relativeTo: this.route,
queryParams: this.queryParamData,
});
}
performFilter() {
let queryParamData: any ={};
const filterCriteria = this.filtersForm.value;
const nonEmptyFilters = Object.keys(filterCriteria).filter(
(n: any) => filterCriteria[n]
);
let filterBy = '';
let filterValue='';
nonEmptyFilters.forEach((d) => {
if (filterCriteria[d].constructor.name == 'Array' &&
filterCriteria[d][0] != undefined) {
let temp: any[] = [];
filterCriteria[d].forEach((m: any) => {
if(m.item_text)
temp.push(m.item_text);
else
temp.push(m)
});
filterCriteria[d] = temp;
queryParamData[d] = temp;
} else {
queryParamData[d] = filterCriteria[d];
filterCriteria[d] = filterCriteria[d];
filterBy += d + ',';
filterValue += filterCriteria[d]+',';
}
});
if(filterValue) {
this.queryParamData.filterBy = filterBy.slice(0,-1);
this.queryParams.filterBy = filterBy.slice(0,-1);
this.queryParamData.filterValue = filterValue.slice(0,-1);
this.queryParams.filterValue = filterValue.slice(0,-1);
} else {
delete this.queryParamData.filterBy;
delete this.queryParams.filterBy;
delete this.queryParamData.filterValue;
delete this.queryParams.filterValue;
}
this.updateQueryParams();
this.loader = true;
this.getCodenMappingList();
}
async deleteCoden(coden: any){
console.log('delete coden', coden);
if(confirm("Are you sure to delete coden mapping: "+ coden?.coden))
{
this.loader = true;
this.codenService.deleteCoden(coden?.coden).subscribe((data:
any)=>{
this.getCodenMappingList();
});
}
}
launchCodenMappingModel(coden: any, isEdit: any) {
const newCoden = {...coden};
console.log(newCoden);
newCoden.isEditCoden = isEdit;
this.openCodenMappingModal(newCoden);
}
openCodenMappingModal(coden: any) {
const modalRef = this.modalService.open(MapCodenModalComponent, {
size: 'lg'
});
modalRef.componentInstance.codenValue = coden;
modalRef.componentInstance.emitService.subscribe((value: any) => {
if (value === 'success') {
this.getCodenMappingList();
}
})
}
getCodenMappingList() {
this.loader = true;
this.codenService.getCodenList(this.queryParams)
.subscribe((response: any) => {
this.codenList = response.data;
this.loader = false;
this.allCodenList = this.codenList;
this.collectionSize = response.size;
});
}
}
On-Prem-application
MainApplication.ts
class MainApplication {
private jobQueue: JobQueue;
private eventManager: EventManager;
private jobProcessor: JobProcessor;
private watcher: Watcher;
constructor() {
this.jobQueue = new JobQueue(); // Assuming JobQueue needs a reference to dbService
this.eventManager = new EventManager(this.jobQueue);
this.jobProcessor = new
JobProcessor(parseInt(process.env.MAX_CONCURRENT_JOBS), this.jobQueue,
this.eventManager); // N=10 for max concurrent jobs
this.watcher = new Watcher(process.env.WATCH_DIR, this.eventManager);
}
if (unsetEnvVars.length > 0) {
Logger.error(`Missing required environment variables:
${unsetEnvVars.join(', ')}`);
return false;
}
return true;
}
private async initializeDatabaseConnection(): Promise<boolean> {
try {
await dbService.initialize(); // Assuming your PostgreSQLService has an
initialize method to test connection
Logger.info('Database connection successfully established.');
return true;
} catch (error) {
Logger.error('Failed to establish database connection:', error);
return false;
}
}
try {
Object.values(EVENT_TYPE_FOLDER_PATH_MAP).forEach(dirName => {
const fullPath = join(extractDirPath, dirName);
const xmlFullPath = join(fullPath, 'xml');
if (!fs.existsSync(xmlFullPath)) {
Logger.info(`Directory does not exist, creating: ${xmlFullPath}`);
fs.mkdirSync(xmlFullPath, { recursive: true });
}
const processedFullPath = join(fullPath, 'processed');
if (!fs.existsSync(processedFullPath)) {
Logger.info(`Directory does not exist, creating:
${processedFullPath}`);
fs.mkdirSync(processedFullPath, { recursive: true });
}
async start() {
Logger.info('Starting application...');
const envCheck = await this.checkEnvironmentVariables();
const dbConnected = await this.initializeDatabaseConnection();
if (!envCheck || !dbConnected) {
Logger.error('Application failed to start due to missing
prerequisites.');
return; // Stop the application startup process if checks fail
}
this.checkTargetDirectories();
await this.eventManager.loadUnProcessedEvents();
this.jobProcessor.start();
this.watcher.start();
async stop() {
Logger.info('Stopping application...');
httpServer.close();
Logger.info('HTTP Server closed');
try {
await dbService.close();
Logger.info('DB Connection closed');
} catch (error) {
Logger.error('DB Connection close Error: ',error);
}
private initializeOutputFolders() {
Object.keys(FixedOutputFileNames).forEach(fileName => {
this.createFolderForFile(FixedOutputFileNames[fileName]);
});
}
private createFolderForFile(filePath) {
if (filePath) {
if(filePath.includes('.')) {
filePath = path.dirname(filePath);
}
try {
if (!fs.existsSync(filePath)) {
fs.mkdirSync(filePath, { recursive: true });
}
Logger.info(`Directory created or already exists: ${filePath}`);
} catch (error) {
Logger.error(`Error creating directory: ${error}`);
}
}
}
private setupGracefulShutdown() {
process.on('SIGINT', async () => {
Logger.info('Received SIGINT. Initiating graceful shutdown...');
await this.stop();
});
run() {
this.start();
this.setupGracefulShutdown();
}
}
const httpServer = require('http').createServer(server);
httpServer.listen(PORT, () => {
console.log("Server running at PORT: ", PORT);
}).on("error", (error) => {
// gracefully handle error
throw new Error(error.message);
});
POSTGRES
class PostgreSQLService {
private pool: Pool;
constructor() {
Logger.info('DB url is::' + process.env.DATABASE_URL);
this.pool = new Pool({
connectionString: process.env.DATABASE_URL,
max: process.env.PG_POOL_MAX ? parseInt(process.env.PG_POOL_MAX, 10) :
20,
});
}
throw error; // Rethrowing the error to handle it in the caller
(MainApplication)
}
}
Event manager
class EventManager {
private jobQueue: JobQueue
constructor(jobQueue: JobQueue) {
this.jobQueue = jobQueue;
}
const existingEventCheckSql = 'SELECT COUNT(*) as count FROM events WHERE
file_path = $1;';
const countResult = await dbService.query<{ count: number
}>(existingEventCheckSql, [filePath]);
if (countResult[0].count > 0) {
return true;
}
return false;
}
if (isFileRegistered) {
Logger.info(`Event for ${filePath} already registered.`);
return; // Skip registration if already exists
}
// Insert the new event with UNPROCESSED status into the database
const insertSql = `INSERT INTO events (file_path, event_type, status)
VALUES ($1, $2, $3) RETURNING *;`;
const event = await dbService.query<FileEvent>(insertSql, [filePath,
eventType, EventStatus.UNPROCESSED]);
async markEventAsProcessed(eventId: string): Promise<void> {
const updateSql = 'UPDATE events SET status = $1 WHERE id = $2';
await dbService.query(updateSql, [EventStatus.PROCESSED, eventId]);
}
}
JOB QUEUE
class JobQueue {
async enqueue(event: FileEvent): Promise<void> {
// Update event status to ENQUEUED in the database
await this.setJobStatus(event.id, EventStatus.ENQUEUED);
}
}
FEEDSERVICE
}
JOB PROCESSOR
  start() {
    this.running = true;
    this.pollQueue();
  }

  stop() {
    this.running = false;
    FeedServiceFactory.getAllServices().forEach(service => {
      service.clearStreamMonitoringInterval();
    });
  }
      // Back off briefly when the queue is empty to avoid busy-waiting
      await new Promise(resolve => setTimeout(resolve, 100));
    }
  }
}
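The `start()`/`stop()` pair above implies a polling loop: while a `running` flag is set, the processor drains the queue and sleeps briefly when it finds nothing to do. The sketch below reconstructs that loop under stated assumptions; the `next`/`handle` callbacks are hypothetical injection points added so the pattern is testable, and the 100 ms sleep mirrors the fragment above.

```typescript
// Sketch of the polling loop implied by start()/stop().
class PollingProcessor<T> {
  private running = false;

  constructor(
    private next: () => T | undefined,        // pull the next job, if any
    private handle: (job: T) => Promise<void>, // process one job
  ) {}

  start(): void {
    this.running = true;
    void this.pollQueue(); // fire and forget; stop() ends the loop
  }

  stop(): void {
    this.running = false;
  }

  private async pollQueue(): Promise<void> {
    while (this.running) {
      const job = this.next();
      if (job !== undefined) {
        await this.handle(job);
      } else {
        // Back off briefly so an empty queue does not busy-wait.
        await new Promise(resolve => setTimeout(resolve, 100));
      }
    }
  }
}
```

Checking the flag on every iteration is what makes `stop()` take effect promptly without interrupting a job mid-flight.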
TOOLS / PLATFORM USED
The following tools and platforms were used for this project:
FUTURE SCOPE
SCOPE OF FUTURE APPLICATION:
Software scope describes the data and control to be processed, functional performance,
constraints, interfaces, and reliability. Functions described in the statement of scope are
evaluated and in some cases refined to provide more detail before estimation begins.
Because both cost and schedule estimates are functionally oriented, some degree of
decomposition is often useful.
The project is straightforward to implement and extend. Modules can be reused wherever
required within this portal, the system can be updated into a mobile app, and the interface can
be made more attractive and responsive using CSS. New features can be added as the need
arises, and all modules are designed for flexibility. The scope of this document is to set down
the requirements, clearly identifying the information needed by the user, the source of that
information, and the outputs expected from the system.
Future scope:
The future scope depends directly on the foundation of the project: the system must be
designed so that it remains serviceable as time passes, rather than becoming inadequate later.
It is highly likely that the scope will change as the web application project moves forward, so
the web process model should be incremental. This allows the development team to "freeze"
the scope for one increment so that an operational application release can be created. This
approach lets the developers work without having to accommodate a continual stream of
changes, while still recognizing the continuous evolution that characterizes most applications.
Beyond that, the following basic qualities of the software safeguard its future scope.
Reusability: -
Modules of this application can be reused wherever required, and the application can be
updated in future versions. Reusable software reduces design, coding, and testing costs by
amortizing effort over several designs. Reducing the amount of code also simplifies
understanding, which increases the likelihood that the code is correct. Both types of reuse are
practiced: sharing newly written code within a project, and reusing previously written code on
new projects.
Extensibility: -
This software can be extended in ways its original developers may not have anticipated. The
following principles enhance extensibility: hide data structures, avoid traversing multiple links
or methods, avoid case statements on object type, and distinguish public from private
operations.
Robustness: -
A method is robust if it does not fail even when it receives improper parameters. Supporting
practices include protecting against errors, optimizing only after the program runs correctly,
validating arguments, and avoiding predefined limits.
Understandability: -
A method is understandable if someone other than its creator can understand the code (and
the creator can, after a lapse of time). Keeping methods small and coherent helps to
accomplish this.
Portability: -
Since this is an Internet-based application, its portability and usability depend on the client
being connected to the Internet. Interface design, that is, the design of the web pages, is one
of the major parts of a web application: because it forms the first impression regardless of the
value of the contents, the interface should grab a potential user immediately.
Snapshots of Project:
First Page: LMS DASHBOARD
CODEN - Mapping
CODEN-HISTORY
ON-PREM APPLICATION:
ALMA Page
-END of Project-