bioenergy-research-centers/bioenergy.org
A site dedicated to creating FAIR datasets to share across bioenergy research centers (BRCs) and with the global research community.

Resources

Points of contact at each BRC

Tech contacts

  • Hector Plahar
  • Nick Thrower
  • Clint Cecil

MVP Product Definition

From discussion on 01/30/2024:

  • In scope
    • Build a basic website on a server running at JBEI, using a tech stack that is "modern" but also new to all of the technical personnel working on it.
    • Use agreed-upon processes, defined in the contribution guide.
    • Include all tech components needed to be a database-driven site.
    • Secrets management for database connectivity on the server.
  • Out of scope
    • Automation (CI/CD pipelines).
    • Authentication and authorization within the application.
    • Admin interface.
    • Data import capabilities.
    • Access to the server to deploy outside of JBEI users.
  • Tech stack
    • VM with nginx and Docker installed.
    • Postgres database.
    • Vue.js, node.js, express as language stack.
    • Container-first approach for all components.

Development

Prerequisites:

  • Docker
  • Docker Compose
  • Node.js (version pinned in .nvmrc); a version manager such as nvm or asdf is recommended
  • Postman (optional), useful for testing the API

The application is a monorepo with two main components. The client is a Vue.js application and the API is an Express application.

Running a Postgres container

The following command runs a Postgres container with the password mysecretpassword and publishes the database on host port 6432.

docker run --name postgres -e POSTGRES_PASSWORD=mysecretpassword -d -p 6432:5432 postgres
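For reference when filling in .env, these are the connection parameters implied by the command above. The exact environment variable names the API expects are in .env.sample; the values here are simply read off the docker run line (the defaults of the official postgres image for user and database name):

```javascript
// Connection parameters implied by the docker run command above.
// The database is published on host port 6432, mapped to the container's
// default Postgres port 5432 by "-p 6432:5432".
const conn = {
  host: "localhost",
  port: 6432,                   // host side of -p 6432:5432
  user: "postgres",             // default superuser of the official image
  password: "mysecretpassword", // from POSTGRES_PASSWORD above
  database: "postgres",         // default database of the official image
};

const url = `postgres://${conn.user}:${conn.password}@${conn.host}:${conn.port}/${conn.database}`;
console.log(url);
```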

Running the application

  • Copy the .env.sample file to .env and fill in the environment variables. These can also be set as environment variables on your system.
  • Docker Compose:
    • To run the application in production mode, run docker compose up in the root directory of the project. This starts the nginx server for the client, the Express server for the API, and the Postgres database.
    • To run the application in development mode, run docker compose -f docker-compose.dev.yml up --build --watch. This starts the client and API in development mode with hot reloading.
    • Run docker compose down to stop the application and remove the containers; add -v to also remove the volumes.
    • Running docker compose up --build rebuilds the containers and restarts the application.

Testing

Tests use Vitest and run inside Docker containers. No database connection is required.

# Run API tests
docker compose -f docker-compose.dev.yml run --rm --no-deps api npx vitest run

# Run client tests
docker compose -f docker-compose.dev.yml run --rm --no-deps client npx vitest run

# Run a single test file
docker compose -f docker-compose.dev.yml run --rm --no-deps api npx vitest run tests/services/githubService.test.js
docker compose -f docker-compose.dev.yml run --rm --no-deps client npx vitest run src/__tests__/components/AuthorList.test.js

# Watch mode
docker compose -f docker-compose.dev.yml run --rm --no-deps api npx vitest
docker compose -f docker-compose.dev.yml run --rm --no-deps client npx vitest

Some stderr output (e.g. "Error during search", "Turnstile error") is expected — these are console.error calls from the application code exercised by error-path tests.
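To illustrate why this output is harmless, here is a hypothetical sketch (the function name and messages are illustrative, not the actual application code): an error-path test deliberately triggers a catch block that logs via console.error, so the message lands on stderr even though the function handles the error and the test passes.

```javascript
// Hypothetical sketch of an error path exercised by tests.
// console.error writes to stderr, so the message appears in the test output
// even though the error is handled and the assertions pass.
function search(query) {
  try {
    if (typeof query !== "string") throw new TypeError("query must be a string");
    return []; // a real implementation would query the database here
  } catch (err) {
    console.error("Error during search", err.message); // expected stderr noise
    return null;
  }
}

console.log(search(42));    // null, with "Error during search" on stderr
console.log(search("abc")); // []
```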

API tests

API tests use Supertest for route-level integration tests. Tests are organized under api/tests/ mirroring the source structure:

api/tests/
├── helpers/          # Shared test utilities (Express app factory)
├── models/           # Dataset model tests
├── routes/           # Route integration tests (dataset, message, schema)
├── services/         # Service unit tests (github, ICE, strategy manager)
├── utils/            # Utility unit tests (categories, markdown, turnstile)
└── setup.js          # Test environment variables

Writing API tests:

  • All tests are CommonJS (matching the API codebase).

  • Vitest globals (describe, it, expect, vi, beforeEach) are available without imports.

  • vi.mock() does not reliably intercept CJS require() calls. To mock a dependency, mutate the shared module object instead:

    const myService = require("../../app/services/myService");
    myService.someMethod = vi.fn();

    This works because require() returns the same cached object to all consumers. For this reason, source modules should avoid destructuring at import time (use mod.fn() instead of const { fn } = require(mod)).

  • Route tests use Supertest with a lightweight Express app from tests/helpers/createApp.js (no Sequelize sync or Swagger setup).

  • Database calls are mocked by mutating db.datasets.scope and db.sequelize.query on the shared require("../models") object.

Client tests

Client tests use Vue Test Utils for component testing. Tests are organized under client/src/__tests__/:

client/src/__tests__/
├── components/       # Component unit tests (AuthorList, Footer, FacetFilters, etc.)
├── composables/      # Composable tests (useTurnstile)
├── router/           # Route definition tests
├── services/         # API service tests (Dataset, Message, Schema)
├── store/            # Pinia store tests (searchStore)
└── views/            # View tests (ContactView, versionComponentMap)

Writing client tests:

  • Tests are ESM (matching the client codebase). Import vitest functions explicitly: import { describe, it, expect, vi } from 'vitest'.
  • vi.mock() works for ESM imports. Mock HTTP calls by mocking @/http-common.
  • Mount components with @vue/test-utils. Stub child components and router as needed.

Coverage

Run tests with a coverage report:

docker compose -f docker-compose.dev.yml run --rm --no-deps api npx vitest run --coverage
docker compose -f docker-compose.dev.yml run --rm --no-deps client npx vitest run --coverage

Coverage is enforced at 80% for statements, branches, functions, and lines (configured in api/vitest.config.js and client/vitest.config.js). The CI workflow runs coverage on every pull request and will fail if thresholds are not met.
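The threshold configuration looks roughly like the following. This is a sketch, not the actual file contents, and the exact shape depends on the Vitest version (recent versions nest the numbers under coverage.thresholds):

```javascript
// Sketch of an 80% coverage threshold in a Vitest config.
// See api/vitest.config.js and client/vitest.config.js for the real settings.
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    coverage: {
      thresholds: {
        statements: 80,
        branches: 80,
        functions: 80,
        lines: 80,
      },
    },
  },
});
```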

Troubleshooting: stale Docker images

If you see errors like npx: not found or unexpected behavior when running tests, you may have a stale Docker image cached from a previous build. This can happen when switching between dev and production Dockerfiles, since Docker Compose reuses an existing image if the tag already matches.

To fix this, remove the old image and rebuild:

docker image rm bioenergyorg-client   # or bioenergyorg-api
docker compose -f docker-compose.dev.yml build --no-cache client

Import BRC Data Feeds

  • Run docker compose run api node scripts/import_datafeeds.js from the root folder of the project.
  • To redirect output, including validation errors, to a file, run docker compose run api node scripts/import_datafeeds.js > import_datafeeds.txt 2>&1 (the 2>&1 must come after the file redirection, or stderr will still go to the terminal).
  • Under Windows PowerShell, use the following version of the above command to get a clean output file: cmd /c "docker compose run api node scripts/import_datafeeds.js > import_datafeeds_after.txt 2>&1"
  • If you see warnings like "VITE_*" variable is not set, add that variable to your local .env file as an empty placeholder.

Resources Used to Build This Application

BRC Data Endpoints

Validating Data

Validating data against the BRC schema can be done with the LinkML framework.

This process, including installing LinkML, can be done with the validation script in this repo:

./validate.sh

Alternatively, the process may be done manually:

  • Install the LinkML Python package, as detailed in the LinkML installation documentation.
  • Retrieve a local copy of the data collection in JSON format. For example, run wget https://bioenergy.org/JBEI/jbei.json
  • Retrieve the most recent version of the schema in YAML format. The schema is here: https://github.com/bioenergy-research-centers/brc-schema/blob/main/src/brc_schema/schema/brc_schema.yaml
  • Run the following linkml command: linkml validate --schema brc_schema.yaml -C Dataset <datafile>, replacing <datafile> with the path to your data in JSON.
    • For example, a fully valid jbei.json will yield the following result:
      $ linkml validate --schema brc_schema.yaml -C Dataset jbei.json
      No issues found
      
    • Places where the data does not comply with the schema will be indicated like below:
      $ linkml validate --schema src/brc_schema/schema/brc_schema.yaml -C Dataset jbei-bad.json 
      [ERROR] [jbei-bad.json/0] Additional properties are not allowed ('DATE' was unexpected) in /
      [ERROR] [jbei-bad.json/0] 'date' is a required property in /
      [ERROR] [jbei-bad.json/1] 'yes' is not of type 'boolean', 'null' in /creator/0/primaryContact
      [ERROR] [jbei-bad.json/8] Additional properties are not allowed ('BRC' was unexpected) in /
      [ERROR] [jbei-bad.json/8] 'brc' is a required property in /
      

Copyright Notice

InterBRC Data Products Portal Copyright (c) 2025, The Regents of the University of California, through Lawrence Berkeley National Laboratory, and UT-Battelle LLC, through Oak Ridge National Laboratory (both subject to receipt of any required approvals from the U.S. Dept. of Energy), University of Wisconsin - Madison, University of Illinois Urbana - Champaign, and Michigan State University. All rights reserved.

If you have questions about your rights to use or distribute this software, please contact Berkeley Lab's Intellectual Property Office at [email protected].

NOTICE. This Software was developed under funding from the U.S. Department of Energy and the U.S. Government consequently retains certain rights. As such, the U.S. Government has been granted for itself and others acting on its behalf a paid-up, nonexclusive, irrevocable, worldwide license in the Software to reproduce, distribute copies to the public, prepare derivative works, and perform publicly and display publicly, and to permit others to do so.
