
SRS Smart Traffic Signal Control System v2

The Software Requirements Specification (SRS) outlines the functional and non-functional requirements for a Smart Traffic Signal Control System using computer vision, designed to optimize traffic signal timings based on real-time video analysis. The document details the system architecture, user interfaces, performance metrics, and verification methods, targeting a prototype for proof-of-concept rather than production deployment. Key components include vehicle detection, traffic metrics computation, and emergency handling, with a focus on modular design for future enhancements.


# Software Requirements Specification (SRS)

## Smart Traffic Signal Control System Using Computer Vision

**Document Version:** 2.0


**Date:** 2025-09-10
**Prepared by:** Project Team — Smart Traffic Signal Control (Lead: Kanishka Pandey)

---

## Revision History
- **v1.0** — Initial draft (core sections)
- **v1.1** — Expanded functional and non-functional requirements
- **v2.0** — Regenerated and substantially expanded to comply with IEEE 29148 guidelines; adds verification,
acceptance criteria, risk analysis, detailed interfaces, and traceability matrix.

---

## Table of Contents
1. Introduction
1.1 Purpose
1.2 Scope
1.3 Definitions, Acronyms, and Abbreviations
1.4 References
1.5 Overview
2. Overall Description
2.1 Product Perspective
2.2 System Interfaces
2.3 User Classes and Characteristics
2.4 Operating Environment
2.5 Design and Implementation Constraints
2.6 Assumptions and Dependencies
3. Specific Requirements
3.1 Functional Requirements (FR)
3.2 External Interface Requirements
3.3 Performance Requirements
3.4 Logical Database Requirements
3.5 Safety and Regulatory Requirements
3.6 Software Quality Attributes (Non-functional Requirements)
4. System Architecture and Design Overview
4.1 Component Descriptions
4.2 Data Flow and Control Flow
4.3 Deployment View
5. Use Cases and Scenarios
5.1 Primary Use Cases
5.2 Sequence Diagrams
6. Verification and Validation
6.1 Acceptance Criteria
6.2 Test Plan Overview
6.3 Traceability Matrix
7. Operation and Maintenance
7.1 Installation and Deployment
7.2 Backup and Recovery
7.3 Monitoring and Diagnostics
8. Project Deliverables, Schedule and Budget
9. Risks, Mitigation and Safety Analysis
10. Appendices
A. Data Dictionary
B. Configuration Parameters
C. Sample Message Formats
D. PlantUML Diagrams
E. Glossary
---

## 1. Introduction

### 1.1 Purpose


This Software Requirements Specification (SRS) defines the functional and non-functional requirements for a
prototype Smart Traffic Signal Control System using computer vision. The system adaptively adjusts traffic
signal timings in near-real time by analyzing CCTV video streams, estimating per-lane traffic demand, and
computing optimized signal plans for intersections in the Delhi–NCR region. This document is intended to guide
design, implementation, testing, and acceptance of the prototype.

### 1.2 Scope


The project delivers a software prototype and simulation platform capable of:
- Ingesting live or recorded CCTV video feeds.
- Detecting and tracking vehicles, classifying vehicle types, assigning vehicles to lanes.
- Computing adaptive signal timings respecting safety bounds and pedestrian needs.
- Applying timing plans to simulated controllers (and optionally to physical controllers during field trials).
- Providing a web-based operator dashboard for monitoring, manual overrides, and replay of simulation data.
- Logging events and metrics for analysis.

The prototype targets proof-of-concept demonstration and evaluation rather than production deployment;
however, the design supports extension to real-world trial deployments.

### 1.3 Definitions, Acronyms, and Abbreviations


- **SRS** — Software Requirements Specification
- **ITS** — Intelligent Transportation System
- **RTSP** — Real Time Streaming Protocol
- **ROI** — Region of Interest
- **FR** — Functional Requirement
- **NFR** — Non-functional Requirement
- **FPS** — Frames Per Second
- **CPU/GPU** — Central Processing Unit / Graphics Processing Unit

### 1.4 References


- IEEE Std 29148-2011: Requirements Engineering
- IEEE Std 1016-2009: Software Design Description
- IEEE Std 829-2008: Software and System Test Documentation
- OpenCV documentation (local copy)
- Internal project bibliography (papers on adaptive signal control and CV methods)

### 1.5 Overview


The remainder of this document expands on system context, detailed functional and non-functional requirements,
test and verification methods, deployment considerations, and appendices containing message schemas and
configuration parameters.

---

## 2. Overall Description

### 2.1 Product Perspective


The product is a software module intended to integrate with existing traffic infrastructure. It sits between
sensing (CCTV cameras) and actuation (traffic signal controllers) and provides a closed-loop control function
that continuously updates phase timings. The architecture allows both standalone simulation runs (for
experiments) and live runs where outputs are transmitted to hardware controllers. The system adheres to
modular principles to ease future upgrades (e.g., replacing detection models, changing optimization
algorithms, adding sensors).

### 2.2 System Interfaces


- **Camera Streams:** RTSP/HTTP/ONVIF streams providing video frames.
- **Controller Interface:** Serial (RS-232/485) or TCP/IP JSON commands to signal controllers, or simulated
endpoints in simulation mode.
- **Operator UI:** Web-based dashboard communicating with backend via REST API and websockets for live
updates.
- **Storage:** SQL database for logs and configuration; object store for video clips and model artifacts.

Detailed example message samples and schemas are included in Appendix C.

### 2.3 User Classes and Characteristics


- **Traffic Operator:** Uses the UI to monitor intersections, respond to alerts, and perform manual overrides.
Must have moderate technical familiarity.
- **System Administrator:** Installs, configures, and maintains software and database; performs backups and
updates.
- **Researcher/Data Analyst:** Runs experiments, exports data, and adjusts model parameters.
- **Emergency Services Contact:** Coordinates emergency preemption rules and policies.

Each user class will have role-based access and different UI permissions.

### 2.4 Operating Environment


- Typical deployment: Linux server (Ubuntu 20.04 LTS or later) with optional NVIDIA GPU (CUDA-enabled) for
acceleration; Windows support is provided for development/test PCs.
- Network: Local area network (Ethernet/Wi-Fi); camera connectivity assumed on LAN. Optional cloud-hosted
analytics for large-scale experiments is supported as an extension.

### 2.5 Design and Implementation Constraints


- Project completion target: 3 months from project start.
- Budget: ₹20 lakhs maximum.
- Use of open-source CV libraries (OpenCV, PyTorch/TensorFlow) is preferred.
- The system will not be certified as a safety-critical traffic controller; live deployments must follow local
regulations and approvals.
- Minimize modifications to existing physical infrastructure; preference for software-only integration.

### 2.6 Assumptions and Dependencies


- Cameras provide sufficient resolution and field-of-view for lane-level detection.
- Stable network connectivity within the experiment environment is available.
- Required external libraries and runtime (e.g., CUDA drivers for GPU) can be installed on target machines.
- Emergency vehicle detection via camera visuals may be supplemented with external GPS or siren detection if
available.

---

## 3. Specific Requirements

### 3.1 Functional Requirements (Expanded)


The following functional requirements expand FR1–FR12 and introduce acceptance criteria where applicable.

**FR1 — Video Capture and Preprocessing**


- FR1.1: Support multiple simultaneous camera streams (min 8 streams per server in prototype).
*Acceptance Criteria:* System accepts and displays 8 RTSP streams simultaneously in the UI test harness.
- FR1.2: Frame sampling rate configurable (1–30 FPS).
*Acceptance Criteria:* Changing sampling rate updates processing load and frame rate within 2 seconds.
- FR1.3: Implement pre-processing: dewarping, perspective transform, ROI masking, brightness normalization.

**FR2 — Vehicle Detection and Tracking**


- FR2.1: Produce bounding boxes with class labels and detection confidences per frame.
*Acceptance Criteria:* On test dataset (daylight), detection precision & recall ≥ 90%.
- FR2.2: Maintain object tracks across frames with unique IDs and compute per-vehicle metrics (entry time,
exit time, speed estimate).
- FR2.3: Map detections to lane polygons defined in configuration.
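
FR2.3's detection-to-lane mapping can be sketched with a point-in-polygon test against the configured lane polygons. The following is an illustrative sketch only, assuming lane polygons are stored as pixel-coordinate vertex lists; the `assign_lane` name and the `lanes` dictionary shape are hypothetical, not part of the specified interfaces:

```python
from typing import Dict, List, Optional, Tuple

Point = Tuple[float, float]

def point_in_polygon(pt: Point, poly: List[Point]) -> bool:
    """Ray-casting test: count how many polygon edges a horizontal
    ray from pt crosses; an odd count means the point is inside."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def assign_lane(bbox: Tuple[float, float, float, float],
                lanes: Dict[str, List[Point]]) -> Optional[str]:
    """Assign a detection to a lane using the bbox's bottom-centre
    point, which approximates the vehicle's road-contact position."""
    x, y, w, h = bbox
    anchor = (x + w / 2.0, y + h)
    for lane_id, polygon in lanes.items():
        if point_in_polygon(anchor, polygon):
            return lane_id
    return None  # detection outside all configured lanes
```

Using the bottom-centre anchor rather than the box centre reduces misassignment for tall vehicles seen at an angle.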

**FR3 — Traffic Metrics Computation**


- FR3.1: Compute queue length per lane in vehicle count and metres, flow rate (vehicles per minute), and
average wait time.
- FR3.2: Provide sliding-window and per-cycle aggregations.
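
The sliding-window aggregation of FR3.1–FR3.2 could be implemented along the following lines. This is a sketch: `FlowCounter` is a hypothetical name, and stop-line crossing events are assumed to be supplied by the tracker:

```python
from collections import deque

class FlowCounter:
    """Sliding-window flow estimator: counts vehicles that crossed a
    lane's stop line within the last `window_s` seconds and scales
    the count to vehicles per minute."""

    def __init__(self, window_s: float = 60.0):
        self.window_s = window_s
        self._events = deque()  # crossing timestamps, oldest first

    def record_crossing(self, ts: float) -> None:
        self._events.append(ts)

    def flow_per_minute(self, now: float) -> float:
        # evict events that have fallen out of the window
        while self._events and now - self._events[0] > self.window_s:
            self._events.popleft()
        return len(self._events) * 60.0 / self.window_s
```

Per-cycle aggregation would use the same event stream but reset the window at cycle boundaries instead of sliding it.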

**FR4 — Timing Optimization**

- FR4.1: Implement at least two strategies: proportional allocation and heuristic predictive allocation
(short-term prediction using last N cycles).
*Acceptance Criteria:* Simulated results show reduced average delay relative to baseline fixed timings by ≥
10% during peak scenarios.
- FR4.2: Respect regulatory minimum/maximum green durations, amber times, and all-red safety intervals.
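
A minimal sketch of the proportional allocation strategy named in FR4.1, with FR4.2's green-time bounds applied as clamps. Note that clamping can make the shares sum to more or less than the available green time; a real optimizer would renormalize the remainder across unclamped phases:

```python
from typing import Dict

def proportional_green_split(demand: Dict[str, float],
                             cycle_green_s: float,
                             min_green_s: float = 10.0,
                             max_green_s: float = 60.0) -> Dict[str, float]:
    """Split available green time across phases in proportion to
    per-phase demand (e.g. queue length), then clamp each share to
    the regulatory [min_green, max_green] interval."""
    total = sum(demand.values())
    if total == 0:
        # no observed demand: split evenly, still respecting bounds
        share = cycle_green_s / len(demand)
        return {p: max(min_green_s, min(max_green_s, share)) for p in demand}
    result = {}
    for phase, d in demand.items():
        share = cycle_green_s * d / total
        result[phase] = max(min_green_s, min(max_green_s, share))
    return result
```

The default bounds here mirror the values in Appendix B.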

**FR5 — Signal Adjustment & Safety**


- FR5.1: Schedule and commit phase changes with safe transitions; support manual override and emergency
preemption.
- FR5.2: Provide logging of all phase changes with timestamps and source (automatic/manual/emergency).

**FR6 — Emergency Handling**


- FR6.1: Detect emergency vehicle presence and preempt phases to provide priority.
*Acceptance Criteria:* Emergency preemption executes within 3 seconds of detection in simulation.
- FR6.2: Provide policy-driven behavior (e.g., immediate stop, green extension) and recovery plan after
passage.

**FR7 — Pedestrian Management**


- FR7.1: Incorporate pedestrian call-in and minimum walk times into phase computations.

**FR8 — Operator UI & Controls**


- FR8.1: Dashboard shows multi-intersection view, per-intersection detail view, logs, and configuration
editor.
*Acceptance Criteria:* Operators can switch to manual override in < 5s and return to auto mode with
confirmation.

**FR9 — Logging & Data Export**


- FR9.1: Export logs and aggregated metrics in CSV/JSON; support scheduled exports.

**FR10 — Simulation Mode**


- FR10.1: Allow scenario creation (CSV arrival rates, vehicle mixes, time-of-day patterns) and accelerated
playback.
*Acceptance Criteria:* Simulation run reproduces metrics and produces reproducible logs for analysis.
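
Scenario arrival streams can be generated reproducibly by sampling a Poisson process with a fixed seed, which supports FR10.1's reproducibility criterion. A sketch; the function name and defaults are illustrative:

```python
import random
from typing import List

def generate_arrivals(rate_per_min: float, duration_s: float,
                      seed: int = 42) -> List[float]:
    """Draw vehicle arrival times (seconds) from a Poisson process by
    sampling exponential inter-arrival gaps. A fixed seed makes
    repeated simulation runs reproduce identical arrival streams."""
    rng = random.Random(seed)
    rate_per_s = rate_per_min / 60.0
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(rate_per_s)
        if t >= duration_s:
            break
        arrivals.append(t)
    return arrivals
```

Time-of-day patterns would be modelled by stitching together segments generated with different rates.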

**FR11 — Security & Access Control**


- FR11.1: Role-based authentication with configurable users and roles.
*Acceptance Criteria:* Unauthorized access is rejected and logged.

**FR12 — Maintenance & Diagnostics**


- FR12.1: Provide health-check endpoints and component-level diagnostics (camera connectivity, inference
latency, DB connectivity).

### 3.2 External Interface Requirements


**User Interface**
- Web UI composed of the following pages: Login, Overview, Intersection Detail (with live video and metrics),
Simulation Control, Settings, Logs.
- Alert panel for warnings and errors.

**Hardware Interface**
- Camera: RTSP/ONVIF compliant.
- Controller: JSON-over-TCP or serial protocol (specifications in Appendix C). Supports commands: set_phase,
extend_phase, force_clear, heartbeat.

**API Interface**
- REST endpoints: /api/v1/status, /api/v1/intersections, /api/v1/control, /api/v1/logs
- WebSocket topic: /ws/metrics for streaming real-time metrics to UI.

### 3.3 Performance Requirements


- End-to-end detection latency: target ≤ 1.5 s per frame on CPU; ≤ 0.3 s on GPU.
- Decision latency (optimization computation): ≤ 0.5 s at cycle end.
- System throughput: support at least 5 intersections with 2 cameras each on a single server.
- Storage: capable of storing 30 days of aggregated metrics; raw video retention configurable (e.g. 24–72
hours) subject to disk capacity.

### 3.4 Logical Database Requirements


- Schema shall include tables: Intersections, Cameras, Lanes, Detections, Tracks, Cycles, Events, Users,
Configurations.
- Timestamp fields shall use ISO 8601 UTC format.
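
For example, conforming timestamps could be produced in Python as follows (a sketch; the helper name is illustrative):

```python
from datetime import datetime, timezone

def utc_now_iso() -> str:
    """Current time as an ISO 8601 UTC string,
    e.g. '2025-09-10T10:00:00+00:00'."""
    return datetime.now(timezone.utc).isoformat(timespec="seconds")
```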

### 3.5 Safety and Regulatory Requirements


- All phase transitions must include amber and all-red intervals conforming to Delhi traffic regulations
(configurable parameters).
- System must never directly switch two conflicting phases to green simultaneously; conflict matrix enforced
in controller interface.
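
The conflict-matrix check enforced in the controller interface could take a shape like the following, where `conflicts` is a configured set of unordered phase pairs that must never be green together (names are illustrative):

```python
from typing import FrozenSet, Set

def is_safe_plan(green_phases: Set[str],
                 conflicts: Set[FrozenSet[str]]) -> bool:
    """Reject any plan that would show green to two conflicting
    phases simultaneously. `conflicts` holds unordered phase pairs
    from the intersection's conflict matrix."""
    phases = sorted(green_phases)
    for i in range(len(phases)):
        for j in range(i + 1, len(phases)):
            if frozenset((phases[i], phases[j])) in conflicts:
                return False
    return True
```

Every command leaving the Timing Optimizer would pass through a check of this kind before reaching the Controller Adapter.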

### 3.6 Software Quality Attributes (Expanded NFRs)


- **Reliability:** Automatic fallback to default fixed-time plan if telemetry is lost for > 30s.
- **Availability:** Target 99.5% uptime for experimental deployment.
- **Maintainability:** Modular code, documented APIs, and CI pipeline for builds and tests.
- **Security:** Password storage with salted hashing, HTTPS for remote UI access, audit logs for critical
actions.
- **Privacy:** Raw video stored only when needed for debugging and maintained per retention policy;
anonymization options to blur faces and license plates are provided as configuration.
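
The 30-second fallback rule under Reliability might be tracked with a small watchdog. A sketch with hypothetical names:

```python
from typing import Optional

class TelemetryWatchdog:
    """Track the age of the last telemetry update; once `timeout_s`
    passes without fresh data, the controller should revert to its
    default fixed-time plan (Reliability NFR, 30 s threshold)."""

    def __init__(self, timeout_s: float = 30.0):
        self.timeout_s = timeout_s
        self.last_update: Optional[float] = None

    def heartbeat(self, now: float) -> None:
        self.last_update = now

    def should_fall_back(self, now: float) -> bool:
        if self.last_update is None:
            return True  # no telemetry received yet
        return now - self.last_update > self.timeout_s
```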

---

## 4. System Architecture and Design Overview

### 4.1 Component Descriptions


- **Camera Connector:** Manages RTSP/HTTP streams; provides frames to buffer.
- **Preprocessor:** Corrects lens distortion, normalizes brightness, applies ROI masks and perspective
transforms.
- **Detector:** Runs object detection and returns bounding boxes and classes.
- **Tracker:** Associates detections across frames to create tracks and estimate speed and dwell time.
- **Lane Manager:** Maps tracks to lane polygons and computes per-lane occupation metrics.
- **Analytics Engine:** Aggregates metrics, computes moving averages, and prepares inputs for optimizer.
- **Timing Optimizer:** Implements algorithms to compute next cycle allocation; respects safety bounds and
policy weights.
- **Controller Adapter:** Formats and sends commands to signal controller devices or simulated endpoints.
- **UI/Backend:** REST API, websockets, and frontend application for operator interaction.
- **Storage Layer:** Relational DB for structured data and file/object store for video clips and model
artifacts.

### 4.2 Data Flow and Control Flow


- Frames flow from Camera Connector → Preprocessor → Detector → Tracker → Lane Manager → Analytics Engine →
Timing Optimizer → Controller Adapter.
- UI subscribes to Analytics Engine output via websockets for live visualization.

### 4.3 Deployment View


- Single-server prototype: all modules run on one machine (Docker containers recommended).
- Multi-server / scale-out: inference workers (GPU-enabled) process video frames; a coordinator service
aggregates metrics and runs global optimization.

PlantUML diagrams for components and sequence flows are included in Appendix D for easy rendering into black-and-white graphics.

---

## 5. Use Cases and Scenarios

### 5.1 Primary Use Cases (brief)


- **UC-01:** Start Adaptive Mode — Operator starts the adaptive control for selected intersections.
- **UC-02:** Manual Override — Operator forces a lane green for a specified duration.
- **UC-03:** Emergency Preemption — System detects emergency vehicle and preempts normal control.
- **UC-04:** Run Simulation Scenario — Researcher runs a scenario and collects results.
- **UC-05:** View Historical Metrics — Analyst downloads CSV of last 24 hours of metrics.

### 5.2 Sequence Diagrams


(Sequence diagrams in Appendix D; include vehicle detection → optimizer → controller sequences.)

---

## 6. Verification and Validation

### 6.1 Acceptance Criteria (Summary)


- Detection accuracy: ≥ 90% precision & recall on daylight test set; ≥ 80% at night.
- Latency targets met for detection and decision stages as defined in Section 3.3.
- Simulation experiments show average delay reduction ≥ 10% against fixed-time baseline on at least two
representative peak scenarios.
- Manual override and emergency preemption operate correctly as demonstrated in test scenarios.
- UI role-based access control functions and logs authentication events.
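
The detection-accuracy criteria can be checked mechanically once detections are matched to ground-truth boxes (e.g. by an IoU threshold). A sketch:

```python
from typing import Tuple

def precision_recall(tp: int, fp: int, fn: int) -> Tuple[float, float]:
    """Precision = TP/(TP+FP); recall = TP/(TP+FN). Counts come from
    matching detections against ground-truth labels frame by frame."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

def meets_daylight_target(tp: int, fp: int, fn: int) -> bool:
    """Check the daylight acceptance threshold of >= 90% on both metrics."""
    p, r = precision_recall(tp, fp, fn)
    return p >= 0.90 and r >= 0.90
```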

### 6.2 Test Plan Overview


- **Unit Tests:** Core modules (lane assignment, optimizer) with coverage target ≥ 70%.
- **Integration Tests:** Camera → detection → optimizer → controller loop in simulation mode.
- **System Tests:** End-to-end tests on recorded video datasets and synthetic scenarios.
- **Performance Tests:** Load tests for multiple intersections and stress tests for component failures.
- **Acceptance Tests:** Defined scenarios run with stakeholders (traffic engineering faculty / simulated
operator) verifying KPIs.

### 6.3 Traceability Matrix (excerpt)


| Requirement | Test Case ID | Verification Method |
|---|---:|---|
| FR1 (Video Capture) | TC-01 | Integration test with RTSP streams |
| FR2 (Detection) | TC-02 | Comparison with ground-truth labels |
| FR4 (Optimization) | TC-05 | Simulation KPI comparison |
| FR6 (Emergency) | TC-09 | Simulated emergency vehicle run |

Full traceability matrix is maintained in the project test repository.

---

## 7. Operation and Maintenance

### 7.1 Installation and Deployment


- Use Docker Compose for single-node deployment with preconfigured containers: backend, frontend, detector
(inference), db, and controller-simulator.
- Provide installation script and checklist for system administrators.

### 7.2 Backup and Recovery


- Database nightly backups; configurable retention (default 30 days).
- Video clip retention configurable; recommended 72 hours.

### 7.3 Monitoring and Diagnostics


- Health endpoints for each module; system dashboard displays component statuses and recent errors.
- Alerts via email/slack for critical failures (DB down, controller unavailable).

---

## 8. Project Deliverables, Schedule and Budget


### Deliverables
- Software prototype and source code repository with CI pipeline.
- Documentation: SRS, Design Document (SDD), Test Plan (STD), User Manual.
- Demo scenarios and datasets.
- Final report and presentation.
### Schedule (3-month timeline — high level)
- Month 1: Requirements finalization, design, camera calibration tools.
- Month 2: Core implementation (detection, tracking, basic optimizer), UI skeleton.
- Month 3: Simulation harness, advanced optimization, testing, documentation.

### Budget (indicative)


- Hardware (server + GPU): ₹8–10 lakhs
- Software & licences / cloud credits: ₹1–2 lakhs
- Misc (data acquisition, travel, contingency): ₹1–2 lakhs
- Personnel / stipends: remainder up to ₹20 lakhs

---

## 9. Risks, Mitigation and Safety Analysis


| Risk | Likelihood | Impact | Mitigation |
|---|---:|---:|---|
| Insufficient camera coverage | Medium | High | Use recorded datasets and synthetic scenarios; plan minimal camera calibration steps |
| Detection model performs poorly at night | High | Medium | Add night-time training data; enable IR or alternate sensors if available |
| Network / controller failure | Medium | High | Fall back to fixed-time plans; operator alerts and manual control |
| Regulatory approval delays | Low | High | Restrict prototype to simulation or controlled field trials with permissions |

Safety: Ensure all automated commands pass a conflict-checking layer to prevent unsafe phase activations.

---

## 10. Appendices

### Appendix A — Data Dictionary (selected)


- **Detection:** {detection_id, frame_ts, cam_id, bbox_x, bbox_y, bbox_w, bbox_h, class, confidence}
- **Track:** {track_id, detection_ids, first_seen_ts, last_seen_ts, estimated_speed}
- **Cycle:** {cycle_id, intersection_id, start_ts, end_ts, applied_phase_plan}

### Appendix B — Configuration Parameters (selected)


- Default cycle length: 90 s
- Min green: 10 s
- Max green: 60 s
- Amber duration: 3 s (configurable per local regulation)
- All-red: 1–2 s (configurable)

### Appendix C — Sample Message Formats


- Controller command (JSON):
  `{"intersection_id": "INT-01", "command": "set_phase", "phase_id": "P2", "duration_ms": 30000}`
- Detection event (JSON):
  `{"cam_id": "CAM-01", "ts": "2025-09-10T10:00:00Z", "detections": [{...}]}`

### Appendix D — PlantUML Diagrams (for rendering in black & white)


```
@startuml
package "Smart Traffic System" {
[Camera(s)] --> [Acquisition Module]
[Acquisition Module] --> [Preprocessor]
[Preprocessor] --> [Detector]
[Detector] --> [Tracker]
[Tracker] --> [Lane Manager]
[Lane Manager] --> [Analytics Engine]
[Analytics Engine] --> [Timing Optimizer]
[Timing Optimizer] --> [Controller Interface]
[Controller Interface] --> [Traffic Controller]
[Analytics Engine] --> [Storage]
[Timing Optimizer] --> [UI/Backend]
[UI/Backend] --> [Operator]
}
@enduml
```

### Appendix E — Glossary


- **Adaptive Control:** Signal control that changes based on observed traffic demand.
- **Track:** A temporally associated sequence of detections representing the same physical vehicle.

---

*End of SRS — Version 2.0*
