Background
This project branches off of research currently being done in the Maryland Information & Network Dynamics (MIND) Lab. In essence, the Iribe building has thousands of sensors providing a variety of information that we would like to visualize using AR for maintenance purposes. We have access to over two years of sensor data, as well as a 3D model of the building. Our project has two goals: to provide information about these sensors, and to provide a visualization for components like pipes and ducts, with the end goal of assisting maintenance work.
Utilities Used
To implement this project we utilized the Unity game engine along with the Microsoft Mixed Reality Toolkit (MRTK). For the majority of the project we used Unity 2020.3.34f with MRTK version 2.7.
While the project’s goal is to operate with Augmented Reality (AR) features, most development was done in a Virtual Reality (VR) environment. Testing was done with both a Valve Index and an iPhone 13 (for iOS deployment), and limited testing was also successful on a Microsoft HoloLens 2.
We aimed to have a physical representation of mechanical equipment such as air supply ductwork and water piping systems. Alongside these virtual counterparts, we worked with symbolic representations of metered systems, such as Variable Air Volume (VAV) units, which would not necessarily overlay their real-world counterparts.
Data has been collected from most of Iribe’s internal sensors and systems, spanning roughly 2019 to 2021. This data is stored in a PostgreSQL database managed by the MIND Lab and accessed via a Django-based RESTful API. Graphing of data was done through the ScottPlot library.
Accomplishments
Interactable Querying with MIND Lab Database
We can currently query the database with varying parameters: meter IDs, start and end dates, and the desired interval between readings. Each of these parameters is displayed to, and adjustable by, the user through a variety of panels. However, our current system is only set up for a subset of VAV system meters, mainly temperature and airflow readings. These were the most natural to graph over time, compared to occupancy statuses or valve openings. Additionally, VAVs were the most plentiful meters in the database: essentially every room has at least one VAV associated with it.
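To illustrate how these panel parameters become a query, here is a minimal Python sketch. The endpoint URL and field names are hypothetical stand-ins, not the MIND Lab API’s actual schema, and the project itself does this in C# inside Unity.

```python
from urllib.parse import urlencode

# Hypothetical base URL -- the real MIND Lab API endpoint differs.
BASE_URL = "https://example.edu/api/readings"

def build_query_url(meter_ids, start_date, end_date, interval_minutes):
    """Assemble a REST query URL from the panel's parameters:
    meter IDs, start/end dates, and reading interval."""
    params = {
        "meters": ",".join(str(m) for m in meter_ids),
        "start": start_date,   # ISO 8601 date strings from the date picker
        "end": end_date,
        "interval": interval_minutes,
    }
    return f"{BASE_URL}?{urlencode(params)}"

url = build_query_url([101, 102], "2020-01-01", "2020-01-02", 60)
```

The query-panel widgets effectively just populate these four parameters before the request is sent.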

This panel was built using basic MRTK buttons and an experimental step slider. It allows a user to enter the query parameters mentioned above in an interactive manner. The panel in the screenshot above sets the default query parameters for all VAV meters; a copy of the query panel is also attached to each graph window.

The date picker group, which appears when selecting the start and end dates, was built from a variety of MRTK tools. No calendar-like tool already existed, so this was built on MRTK slates, buttons, and scrollable objects. The day grid also updates automatically when switching between months.
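The core of that month-switching behavior is computing how many day buttons to show for the selected month. A minimal Python sketch of the calculation (the actual picker is built from MRTK components in C#):

```python
import calendar

def days_in_month(year, month):
    """Number of day buttons the picker needs for a given month.
    monthrange returns (weekday of first day, number of days)."""
    return calendar.monthrange(year, month)[1]
```

Leap years fall out for free: `days_in_month(2020, 2)` is 29 while `days_in_month(2021, 2)` is 28.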

The VAV menu is a customized version of MRTK’s elastic panel. It utilizes MRTK’s grid collection, pressable buttons, and switches. Each VAV has an assortment of meters (10+), which fall into distinct groupings, such as temperature-related and airflow-related meters. We’ve grouped them accordingly here; the switches control which meters are queried and displayed (multiple can be shown at a time in a single graph).
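Conceptually, the panel is backed by meters grouped by category, with a per-meter switch deciding what gets queried together. A small Python sketch of that model (the meter names here are hypothetical examples, not the database’s actual identifiers):

```python
# Hypothetical meter names for a single VAV, grouped by category.
vav_meters = {
    "temperature": ["zone_temp", "setpoint", "discharge_temp"],
    "airflow": ["supply_flow", "flow_setpoint"],
}

# Toggle state mirroring the panel's switches: meter name -> enabled?
toggles = {name: False for group in vav_meters.values() for name in group}

def selected_meters(toggles):
    """Return the meters whose switches are on, to be queried together
    and drawn in a single graph."""
    return [name for name, on in toggles.items() if on]

# Flipping two switches selects two meters for one combined query.
toggles["zone_temp"] = True
toggles["supply_flow"] = True
```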

We’ve created VAV objects that would ideally be present in each room with a VAV unit. Each is essentially a modified MRTK button, imbued with pertinent information for its particular unit, such as the meters associated with it. When one is selected, either by ray casting or grabbing, a VAV menu is created (or updated) with that VAV’s respective data.
Graphing of Meter Data
From the above query abilities, we receive JSON data back from the MIND Lab database with the desired meter readings. This data is then parsed into easier-to-handle data structures for graphing. We use the ScottPlot graphing library to create the raw image bytes, which we then apply to a material within the graph window.
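The parsing step amounts to splitting the JSON reading list into time and value series for the plotter. A Python sketch under an assumed response shape (the real API’s field names may differ; the project does this parsing in C#):

```python
import json
from datetime import datetime

# Hypothetical response payload -- field names are assumptions.
raw = json.dumps([
    {"time": "2020-06-01T00:00:00", "value": 21.4},
    {"time": "2020-06-01T01:00:00", "value": 21.1},
])

def parse_readings(payload):
    """Split a JSON list of readings into parallel x (timestamp)
    and y (value) lists, ready to hand to a graphing library."""
    readings = json.loads(payload)
    xs = [datetime.fromisoformat(r["time"]) for r in readings]
    ys = [r["value"] for r in readings]
    return xs, ys

xs, ys = parse_readings(raw)
```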

Above is an example of a graph window, with hourly temperature readings for a single day. It displays several meters: temperature, setpoint, and discharge temperature. As seen in the top right corner, a user can open the query parameters panel (the same design as attached to the VAV menu) to adjust them for this individual graph if desired. Additionally, the bottom right button allows the user to update the graph by re-querying the API for new data after changing query parameters.
The window itself is a modified MRTK Slate object; we’ve added a graph mesh and supplementary button functionality to it. As stated above, we take the raw image bytes of the graph provided by ScottPlot and update the texture of the graph mesh.
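To illustrate the in-memory image pipeline, here is an analogous Python sketch with matplotlib standing in for ScottPlot: render a plot entirely in memory and hand back raw PNG bytes, the same bytes that in the actual project are uploaded to a Unity texture on the graph mesh.

```python
import io

import matplotlib
matplotlib.use("Agg")  # headless backend; render without a display
import matplotlib.pyplot as plt

def plot_to_png_bytes(xs, ys):
    """Render a line plot in memory and return raw PNG bytes,
    analogous to taking ScottPlot's rendered image and applying
    it to a texture."""
    fig, ax = plt.subplots()
    ax.plot(xs, ys)
    buf = io.BytesIO()
    fig.savefig(buf, format="png")
    plt.close(fig)  # free the figure; only the bytes are kept
    return buf.getvalue()

png = plot_to_png_bytes([0, 1, 2], [21.4, 21.1, 21.8])
```

No file ever touches disk; the byte buffer goes straight to the renderer, which keeps graph refreshes cheap.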
Iribe Modeling
For this project we’ve been provided access to Iribe’s official architectural and mechanical models, created within Autodesk’s Revit software. We filtered out individual systems (e.g., air supply and return ductwork) and exported each as an FBX file. Unity can then natively display these models within the scene, as shown above.

To interact with the models, we’ve created a “layer menu,” built on top of the MRTK near-menu. Each button toggles the visibility of a set of models within the scene, allowing a user to quickly hide layers in order to focus on what they need. Of course, this can be expanded to additional systems the user may want to view in the future.
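The menu’s behavior reduces to a visibility flag per layer that each button flips. A Python sketch of that model (layer names taken from the systems mentioned above; in Unity, the flag maps to activating or deactivating the corresponding GameObjects):

```python
# Visibility state backing the layer menu: one flag per model group.
layers = {
    "walls": True,
    "supply_ducts": True,
    "return_ducts": True,
}

def toggle_layer(layers, name):
    """Flip a layer's visibility and return the new state; in Unity
    this corresponds to calling SetActive on the layer's objects."""
    layers[name] = not layers[name]
    return layers[name]
```

Adding a future system is then just another entry in the dictionary plus a button wired to `toggle_layer`.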
Various UI/UX Tooling

For convenience, we’ve added MRTK’s hand menu, which opens on the ulnar side of a palm-up hand. Within this menu we’ve added the VAV, layers, and map buttons. The VAV button opens the VAV menu, either using the last VAV object selected for its data or remaining in an empty state. The layers button opens the menu described in the Iribe Modeling section. The map button is unimplemented, but would potentially show a bird’s-eye or “mini-map” view of the user’s area.

We’ve also brought in MRTK’s experimental Dock. While menus and graph windows can be put into “Follow Mode,” we found this became cluttered, so as a solution we attached the Dock to the user. Additionally, we’ve made graph windows and VAV objects capable of being placed into these docking stations, acting as a sort of inventory. This is helpful when inspecting several rooms or a wide variety of data, and the user wants to refer back or quickly compare.
Demonstrations
VR Demo
iOS Demos
Contributions to the Project
While this project idea originated from the MIND Lab, we believe we’ve made progress toward their desired end goals. With the current project, a user can walk around a floor of Iribe. Two ductwork systems were pulled from the provided Revit model (gaining working knowledge of Revit was a feat in itself). These systems and parts of the model can be dynamically toggled for viewing by the user. Additionally, a subset of meters can now be queried via interactive means, with the meter data compiled into a graph for easy viewing. We believe we’ve met the group’s desired early goals. Of course, there is plenty of room for improvement and work to be done!
Future Work
First and foremost, future work includes testing with the eventual end-users of this application. While the current setup functions for us, actual needs and interaction patterns may differ from our expectations. In addition to user testing, we believe deployment to more devices would be prudent, to see which best fits the job according to the end-users. While the project runs (mostly) on iOS and somewhat on HoloLens 2, there are clear areas that need to change in order to improve usability and the overall experience.
Currently we have no way of localizing the user within Iribe; their location is hard-coded into the Unity scene. We believe we could utilize spatial anchors, such as Azure Spatial Anchors, to locate the user within the building.
The project also needs to be expanded to the rest of Iribe. While the ductwork and piping systems will likely be quick work, attaching data and positioning VAV objects throughout the building may be a time-consuming process. It should be explored whether there is a programmatic way of placing meters and markers on the models exported from Revit into Unity.
Contributions
| NAME | CONTRIBUTIONS |
| --- | --- |
| William Ingold | Queried & parsed information from the MIND Lab API and database. Created the interactable query parameter panel, VAV elastic panel, interactable graph windows (with ScottPlot plotting and refreshable data), date picker group, and VAV object. Some hand menu functionality. Learned Revit and exported the necessary models. Formed the interactions and relationships between the interactable objects described above. VR demonstration. |
| Marko Neskovic | Layer menu with functioning toggling of walls and ductwork systems. iOS debugging, deployment, and demos. Early development for Iribe modeling testing. Prototype object-menu interactions. |
| Logan Stevens | MRTK introduction & early usage. Early virtual environment demo. Embedding tools for data visualization. |