Roadmap

This document outlines a roadmap for developing the SahyadriCompanion mobile app in ten phases, from project setup to final testing and packaging. Each phase includes concrete steps for features such as SOS messaging, plant/animal identification with a machine learning model, offline maps, and user settings. The final module structure of the app is also provided, showing how the JavaScript/TypeScript front end and the Python-trained ML model fit together.

This roadmap assumes a completely clean project. Follow these steps exactly to avoid errors.

Phase 1: Project Setup & Core Navigation

This phase is about building the foundation. Do not skip any steps.

 Step 1: Start a New Project. Open your terminal and run the following command. This will create a new Expo project with Expo Router pre-configured for a tab-based navigation layout (the project name matches the folder used in Step 2).

npx create-expo-app@latest SahyadriCompanion --template tabs

 Step 2: Install Core Dependencies. Navigate into your new project folder and install the necessary libraries for the app's functionality.

cd SahyadriCompanion
npm install expo-location expo-sms @react-native-async-storage/async-storage

 Step 3: Modify the app/(tabs)/_layout.tsx file. Open this file; it is the root layout for your app's main tabs. Change the existing code to set up the four tabs for the new design. A minimal sketch follows.

 Step 4: Create Module Files. Create a file for each screen inside the app/(tabs)/ directory: identify.js, map.js, sos.js, and settings.js.

Expected Outcome: You'll have a working app that displays a tab bar at the bottom.
Tapping on each tab will display the corresponding screen. The app will have a solid
foundation for the next phases.

Phase 2: Emergency SOS Module

This phase focuses on the safety feature.

 Step 1: Create the SOS Screen. Create the file for the SOS screen (app/(tabs)/sos.js).
This screen will contain a prominent button.

 Step 2: Create the Contact Service. Create the folder and file at
src/modules/EmergencySOSModule/ContactService.js. This will handle local
storage of emergency contacts.
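A minimal sketch of the service, assuming contacts are stored as an array of phone-number strings under a single key (the key name is an assumption):

// src/modules/EmergencySOSModule/ContactService.js
import AsyncStorage from '@react-native-async-storage/async-storage';

const CONTACTS_KEY = 'emergency_contacts'; // storage key is an assumption

// Read the saved contact list, defaulting to an empty array.
export async function getContacts() {
  const raw = await AsyncStorage.getItem(CONTACTS_KEY);
  return raw ? JSON.parse(raw) : [];
}

// Append a phone number and persist the updated list.
export async function addContact(phoneNumber) {
  const contacts = await getContacts();
  contacts.push(phoneNumber);
  await AsyncStorage.setItem(CONTACTS_KEY, JSON.stringify(contacts));
  return contacts;
}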

 Step 3: Implement Logic. Write the code in the SOS screen to get the user's location
and use expo-sms to send a message to the contacts from the service.
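A sketch of the core SOS handler; note that expo-sms opens the system SMS composer pre-filled with recipients and text, it cannot send silently:

// Inside app/(tabs)/sos.js (the button's onPress handler)
import * as Location from 'expo-location';
import * as SMS from 'expo-sms';
import { getContacts } from '../../src/modules/EmergencySOSModule/ContactService';

export async function sendSOS() {
  // Location permission must be granted before reading GPS.
  const { status } = await Location.requestForegroundPermissionsAsync();
  if (status !== 'granted') throw new Error('Location permission denied');

  const { coords } = await Location.getCurrentPositionAsync({});
  const contacts = await getContacts();
  if (contacts.length === 0) throw new Error('No emergency contacts saved');

  if (!(await SMS.isAvailableAsync())) throw new Error('SMS not available');
  const message =
    'SOS! I need help. My location: ' +
    `https://maps.google.com/?q=${coords.latitude},${coords.longitude}`;
  await SMS.sendSMSAsync(contacts, message);
}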

Expected Outcome: A functional SOS button that sends a text message with the user's
location to saved contacts.

Phase 3: Identify Module


This phase will build the identification feature, starting with the UI and mock logic.

 Step 1: Create the Identify Screen. Since our new design has a single "Identify" tab,
we'll create one file for it (app/(tabs)/identify.js). This screen will have the camera
UI.
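A minimal sketch of the screen, assuming expo-camera has been installed (npx expo install expo-camera) and a recent Expo SDK that provides CameraView (older SDKs use the Camera component instead). The identify call refers to the mock service created in Step 2 below:

// app/(tabs)/identify.js
import { useRef } from 'react';
import { Button, View } from 'react-native';
import { CameraView, useCameraPermissions } from 'expo-camera';
import { identify } from '../../src/modules/IdentifyModule/IdentificationService';

export default function IdentifyScreen() {
  const cameraRef = useRef(null);
  const [permission, requestPermission] = useCameraPermissions();

  // Ask for camera access before rendering the preview.
  if (!permission?.granted) {
    return <Button title="Grant camera access" onPress={requestPermission} />;
  }

  const onCapture = async () => {
    const photo = await cameraRef.current.takePictureAsync();
    const result = await identify(photo.uri);
    console.log(result); // replace with a proper result card in the UI
  };

  return (
    <View style={{ flex: 1 }}>
      <CameraView ref={cameraRef} style={{ flex: 1 }} />
      <Button title="Identify" onPress={onCapture} />
    </View>
  );
}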

 Step 2: Implement Mock Service. Create the file at src/modules/IdentifyModule/IdentificationService.js. This service will return a mock result after a short delay, simulating the ML model.
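A minimal sketch of the mock service; the label, confidence, and delay are placeholders:

// src/modules/IdentifyModule/IdentificationService.js
// Returns a fake result after a short delay to simulate on-device
// inference; Phase 6 swaps this for the real TFLite model.
export function identify(imageUri) {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve({
        label: 'Sample Plant', // placeholder label
        confidence: 0.9,       // placeholder confidence
        imageUri,
      });
    }, 1500);
  });
}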

Expected Outcome: A working camera view and a button that, when pressed, displays a
fake identification result.

Phase 4: Offline Map Module

This phase focuses on the navigation and data display.

 Step 1: Create the Map Screen. Create the file at app/(tabs)/map.js. This screen will
be where the map and GPS information are displayed.

 Step 2: Create the Trek Data File. Create the src/modules/MapModule/TrekData.json file to store the static trek information.
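One possible shape for the file; every value below is a placeholder to be replaced with real trek data:

[
  {
    "id": "sample-trek",
    "name": "Sample Trek",
    "difficulty": "Moderate",
    "distanceKm": 7.5,
    "start": { "latitude": 18.75, "longitude": 73.4 },
    "description": "Placeholder entry; replace with real trek data."
  }
]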

 Step 3: Implement Logic. Write the code in map.js to get the user's GPS location
and display the trek information from the JSON file.
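A minimal sketch of the screen, assuming TrekData.json uses the shape sketched above:

// app/(tabs)/map.js
import { useEffect, useState } from 'react';
import { Text, View } from 'react-native';
import * as Location from 'expo-location';
import treks from '../../src/modules/MapModule/TrekData.json';

export default function MapScreen() {
  const [coords, setCoords] = useState(null);

  useEffect(() => {
    (async () => {
      // Request permission, then take a single GPS fix.
      const { status } = await Location.requestForegroundPermissionsAsync();
      if (status !== 'granted') return;
      const position = await Location.getCurrentPositionAsync({});
      setCoords(position.coords);
    })();
  }, []);

  return (
    <View style={{ flex: 1, padding: 16 }}>
      <Text>
        {coords
          ? `Lat ${coords.latitude.toFixed(5)}, Lon ${coords.longitude.toFixed(5)}`
          : 'Getting GPS fix...'}
      </Text>
      {treks.map((trek) => (
        <Text key={trek.id}>
          {trek.name} ({trek.distanceKm} km, {trek.difficulty})
        </Text>
      ))}
    </View>
  );
}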

Expected Outcome: A functional screen that shows the user's live GPS coordinates and
provides details about a sample trek.

Phase 5: Settings Module

This lets users configure emergency contacts, app preferences, etc.

Steps:

1. Create Settings Screen → app/(tabs)/settings.js.

2. Use AsyncStorage to save (see the PreferencesService sketch after this list):

o Emergency contacts (phone numbers).

o Preferences (e.g., distance units, app theme).

3. Provide Input UI (TextInput, Button) for adding/editing contacts.
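A minimal sketch of a PreferencesService, assuming preferences live as a single JSON blob under one key (the key name and defaults are assumptions); contacts reuse the ContactService from Phase 2:

// src/modules/SettingsModule/PreferencesService.js
import AsyncStorage from '@react-native-async-storage/async-storage';

const PREFS_KEY = 'user_preferences'; // storage key is an assumption
const DEFAULTS = { units: 'km', theme: 'light' }; // assumed defaults

// Read saved preferences, falling back to defaults for missing keys.
export async function getPreferences() {
  const raw = await AsyncStorage.getItem(PREFS_KEY);
  return raw ? { ...DEFAULTS, ...JSON.parse(raw) } : { ...DEFAULTS };
}

// Update one preference and persist the merged object.
export async function setPreference(key, value) {
  const prefs = await getPreferences();
  prefs[key] = value;
  await AsyncStorage.setItem(PREFS_KEY, JSON.stringify(prefs));
  return prefs;
}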

📌 Expected Outcome: A functional settings page where users can add emergency contacts
and update preferences.
✅ Phase 6: Replace Mock Identify with Real ML Model

This is where we connect the Python-trained ML model to React Native.

Steps:

1. Train/obtain a model in Python (e.g., MobileNet for plants/animals).

o Export as TensorFlow Lite (TFLite) model → model.tflite.

o Create a label file labels.txt.

2. Bundle model into the app:

o Put model.tflite + labels.txt inside assets/model/.

3. Install TFLite support in React Native:

npm install react-native-tflite
npx pod-install

(If using Expo, you may need a dev build via expo prebuild.)

4. Update IdentifyModule (a sketch follows this list):

o Load camera image → preprocess with expo-image-manipulator.

o Run inference with TFLite.

o Display prediction + confidence score.
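A sketch of the updated service. The preprocessing call is real expo-image-manipulator API; runModel and the ./tfliteRuntime import are hypothetical stand-ins, since each TFLite wrapper (react-native-tflite, react-native-fast-tflite, etc.) exposes its own interface that you would adapt here:

// src/modules/IdentifyModule/IdentificationService.js (Phase 6 version)
import * as ImageManipulator from 'expo-image-manipulator';
// Hypothetical wrapper module; adapt to your TFLite library's real API.
import { runModel } from './tfliteRuntime';

export async function identify(imageUri, labels) {
  // Resize to the model's input size (224x224 is typical for MobileNet).
  const resized = await ImageManipulator.manipulateAsync(
    imageUri,
    [{ resize: { width: 224, height: 224 } }],
    { base64: true, format: ImageManipulator.SaveFormat.JPEG }
  );

  // Assumed to return one confidence score per entry in labels.txt.
  const scores = await runModel(resized.base64);

  // Pick the highest-scoring class.
  let best = 0;
  for (let i = 1; i < scores.length; i += 1) {
    if (scores[i] > scores[best]) best = i;
  }
  return { label: labels[best], confidence: scores[best] };
}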

📌 Expected Outcome: The "Identify" tab actually runs the ML model offline and shows real
predictions.

✅ Phase 7: Offline Maps

So far the map relies on an internet connection (react-native-maps with online tiles). Let's add offline capability.

Steps:

1. Download trek maps (tiles) beforehand → from OpenStreetMap.

o Use libraries like react-native-offline-maps OR cache tiles in FileSystem.

o Alternatively, bundle pre-downloaded GPX/KML trek routes in assets/treks/.

2. Install map and filesystem support:

npx expo install react-native-maps expo-file-system

3. Display trek routes and your current GPS location using react-native-maps with custom tiles (a sketch follows this list).
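A sketch of an offline map screen. It assumes OSM tiles were downloaded ahead of time into a tiles/ folder in the app's document directory, and that the bundled GPX route was converted at build time into a JSON array of { latitude, longitude } points (the sample_trek.json file name is a placeholder; whether file:// tile templates work depends on platform and map provider):

// app/(tabs)/map.js (Phase 7 version)
import * as FileSystem from 'expo-file-system';
import MapView, { Polyline, UrlTile } from 'react-native-maps';
// Placeholder file: the GPX route converted to JSON at build time.
import route from '../../assets/treks/sample_trek.json';

export default function OfflineTrekMap() {
  return (
    <MapView
      style={{ flex: 1 }}
      showsUserLocation
      initialRegion={{
        latitude: route[0].latitude,
        longitude: route[0].longitude,
        latitudeDelta: 0.05,
        longitudeDelta: 0.05,
      }}
    >
      {/* Serve pre-downloaded OSM tiles from local storage. */}
      <UrlTile
        urlTemplate={`${FileSystem.documentDirectory}tiles/{z}/{x}/{y}.png`}
      />
      <Polyline coordinates={route} strokeWidth={4} />
    </MapView>
  );
}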

📌 Expected Outcome: Users can view trek routes even without internet, and GPS still
works offline.

✅ Phase 8: Extra Safety Features

Enhance SOS & user safety.

Steps:

1. Add background location tracking (expo-location together with expo-task-manager; see the sketch after this list).

2. Save last known coordinates to AsyncStorage.

3. Extend SOS to also:

o Flash the torch (e.g., expo-camera's torch mode or react-native-torch).

o Play a loud alarm sound (expo-av).
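A sketch of the background-tracking and alarm pieces, assuming expo-task-manager and expo-av are installed alongside expo-location. The file name, task name, storage key, and alarm asset are all placeholders:

// src/modules/EmergencySOSModule/SafetyService.js (hypothetical file)
import * as Location from 'expo-location';
import * as TaskManager from 'expo-task-manager';
import AsyncStorage from '@react-native-async-storage/async-storage';
import { Audio } from 'expo-av';

const LOCATION_TASK = 'background-location'; // task name is an assumption

// Runs in the background: persist the last fix for offline SOS use.
TaskManager.defineTask(LOCATION_TASK, async ({ data, error }) => {
  if (error || !data) return;
  const { locations } = data;
  const last = locations[locations.length - 1];
  await AsyncStorage.setItem('last_known_location', JSON.stringify(last.coords));
});

export async function startTracking() {
  await Location.requestForegroundPermissionsAsync();
  await Location.requestBackgroundPermissionsAsync();
  await Location.startLocationUpdatesAsync(LOCATION_TASK, {
    accuracy: Location.Accuracy.Balanced,
    timeInterval: 60000,  // at most one update per minute to save battery
    distanceInterval: 50, // or every 50 metres moved
  });
}

export async function playAlarm() {
  // alarm.mp3 is a placeholder asset you would add under assets/.
  const { sound } = await Audio.Sound.createAsync(
    require('../../../assets/alarm.mp3')
  );
  await sound.setIsLoopingAsync(true);
  await sound.playAsync();
}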

📌 Expected Outcome: A more robust SOS button that shares the user's location and can alert people nearby.

✅ Phase 9: Community & Extras

(Not part of the MVP, but future features worth planning for.)

 User contributions:
Let trekkers upload new plant/animal photos with notes. Store locally first, sync
online later.

 Community forum:
Simple Q&A style board (can use Firebase later).

 Weather forecast:
Integrate with OpenWeatherMap API (online only).

 First aid guide:
Offline HTML/Markdown stored in assets, displayed in-app.

✅ Phase 10: Testing & Packaging

Final stage before release.

Steps:
1. Testing:

o Test offline (airplane mode).

o Test ML predictions with sample images.

o Test SOS on real devices.

2. Polish UI:

o Use Expo vector icons for navigation tabs.

o Add splash screen + app icon.

3. Build for release:

Android:

eas build -p android --profile preview

iOS (if needed, on a Mac):

eas build -p ios

📌 Expected Outcome: An installable .apk that runs on your phone, fully offline for trekking.

🧩 Final Module Structure

SahyadriCompanion/
├── app/
│   └── (tabs)/
│       ├── identify.js
│       ├── map.js
│       ├── sos.js
│       ├── settings.js
│       └── _layout.tsx
├── src/
│   └── modules/
│       ├── IdentifyModule/
│       │   └── IdentificationService.js
│       ├── MapModule/
│       │   └── TrekData.json
│       ├── EmergencySOSModule/
│       │   └── ContactService.js
│       └── SettingsModule/
│           └── PreferencesService.js
└── assets/
    ├── model/
    │   ├── model.tflite
    │   └── labels.txt
    └── treks/
        └── sample_trek.gpx

 Front-end (Your App): The mobile app the user interacts with is built with JavaScript and TypeScript. All the code written so far for the UI, navigation, and core offline features is in these languages.

 ML Model (The "Backend"): The ML model that identifies plants and animals is typically trained in Python with libraries like TensorFlow or PyTorch. After training, the model is converted to a format like TensorFlow Lite (TFLite) and bundled with the JavaScript app.
