A site dedicated to creating FAIR datasets to share across bioenergy research centers (BRCs) and to the global research community.
- GitHub group
- Slack workspace
- Mailing list - Developers (Google group)
- Contribution Guide
- Contributors (have emailed license agreement to Nathan Hillson)
- JBEI (lead) = Nathan Hillson ([email protected])
- GLBRC = Dirk Norman ([email protected])
- CABBI = Leslie Stoecker ([email protected])
- CBI = Stanton Martin ([email protected])
- Hector Plahar
- Nick Thrower
- Clint Cecil
From discussion on 01/30/2024:
- In scope
- Build a basic website on a server running at JBEI, using a tech stack that is "modern" but also new to all of the technical personnel working on it.
- Use agreed upon processes, defined in the contribution guide.
- Include all tech components needed to be a database-driven site.
- Secrets management for database connectivity on the server.
- Out of scope
- Automation (CI/CD pipelines).
- Authentication and authorization within the application.
- Admin interface.
- Data import capabilities.
- Access to the server to deploy outside of JBEI users.
- Tech stack
- VM with nginx and Docker installed.
- Postgres database.
- Vue.js, Node.js, and Express as the language stack.
- Container-first approach for all components.
Prerequisites:
- Docker
- Docker Compose
- Node.js (version in .nvmrc), recommend using a version manager like nvm or asdf
- Postman is useful for testing the API.
The application is a monorepo with two main components. The client is a Vue.js application and the API is an Express application.
The following command will run a Postgres container with the password `mysecretpassword` and expose the database on port 6432:

```
docker run --name postgres -e POSTGRES_PASSWORD=mysecretpassword -d -p 6432:5432 postgres
```
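As a sketch of how an application might assemble a connection string for this container, the snippet below builds a Postgres URL from environment variables. The `DB_*` names and defaults are illustrative assumptions, not the project's actual configuration variables:

```javascript
// Sketch: assembling a Postgres connection URL for the container above.
// The DB_* variable names and their defaults are illustrative assumptions,
// not the project's actual environment variables.
const cfg = {
  host: process.env.DB_HOST || "localhost",
  port: process.env.DB_PORT || 6432, // host port mapped to the container's 5432
  user: process.env.DB_USER || "postgres",
  password: process.env.DB_PASSWORD || "mysecretpassword",
  database: process.env.DB_NAME || "postgres",
};

const url = `postgres://${cfg.user}:${cfg.password}@${cfg.host}:${cfg.port}/${cfg.database}`;
console.log(url);
```

Note that the host port (6432) differs from Postgres's default (5432) because of the `-p 6432:5432` mapping above.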
- Copy the `.env.sample` file to `.env` and fill in the environment variables. These can also be set as environment variables on your system.
- Docker Compose:
  - To run the application in production mode, run `docker-compose up` in the root directory of the project. This will start the nginx server for the client, the Express server for the API, and the Postgres database.
  - To run the application in development mode, run `docker compose -f docker-compose.dev.yml up --build --watch`. This will start the client and API in development mode with hot reloading.
  - You can run `docker-compose down` to stop the application and destroy the containers and volumes.
  - Running `docker-compose up --build` will rebuild the containers and restart the application.
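For orientation, a local `.env` for development might look roughly like the fragment below. The variable names here are hypothetical placeholders for illustration; the authoritative list is the `.env.sample` file in the repo:

```
# Hypothetical example only; consult .env.sample for the real variable names.
DB_HOST=localhost
DB_PORT=6432
DB_PASSWORD=mysecretpassword
# Empty placeholder to silence "VITE_*" variable warnings:
VITE_TURNSTILE_SITE_KEY=
```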
Tests use Vitest and run inside Docker containers. No database connection is required.
```
# Run API tests
docker compose -f docker-compose.dev.yml run --rm --no-deps api npx vitest run

# Run client tests
docker compose -f docker-compose.dev.yml run --rm --no-deps client npx vitest run

# Run a single test file
docker compose -f docker-compose.dev.yml run --rm --no-deps api npx vitest run tests/services/githubService.test.js
docker compose -f docker-compose.dev.yml run --rm --no-deps client npx vitest run src/__tests__/components/AuthorList.test.js

# Watch mode
docker compose -f docker-compose.dev.yml run --rm --no-deps api npx vitest
docker compose -f docker-compose.dev.yml run --rm --no-deps client npx vitest
```

Some stderr output (e.g. "Error during search", "Turnstile error") is expected; these are console.error calls from the application code exercised by error-path tests.
API tests use Supertest for route-level integration tests. Tests are organized under `api/tests/`, mirroring the source structure:

```
api/tests/
├── helpers/     # Shared test utilities (Express app factory)
├── models/      # Dataset model tests
├── routes/      # Route integration tests (dataset, message, schema)
├── services/    # Service unit tests (github, ICE, strategy manager)
├── utils/       # Utility unit tests (categories, markdown, turnstile)
└── setup.js     # Test environment variables
```
Writing API tests:

- All tests are CommonJS (matching the API codebase).
- Vitest globals (`describe`, `it`, `expect`, `vi`, `beforeEach`) are available without imports.
- `vi.mock()` does not reliably intercept CJS `require()` calls. To mock a dependency, mutate the shared module object instead: `const myService = require("../../app/services/myService"); myService.someMethod = vi.fn();`. This works because `require()` returns the same cached object to all consumers. For this reason, source modules should avoid destructuring at import time (use `mod.fn()` instead of `const { fn } = require(mod)`).
- Route tests use Supertest with a lightweight Express app from `tests/helpers/createApp.js` (no Sequelize sync or Swagger setup).
- Database calls are mocked by mutating `db.datasets.scope` and `db.sequelize.query` on the shared `require("../models")` object.
Client tests use Vue Test Utils for component testing. Tests are organized under `client/src/__tests__/`:

```
client/src/__tests__/
├── components/   # Component unit tests (AuthorList, Footer, FacetFilters, etc.)
├── composables/  # Composable tests (useTurnstile)
├── router/       # Route definition tests
├── services/     # API service tests (Dataset, Message, Schema)
├── store/        # Pinia store tests (searchStore)
└── views/        # View tests (ContactView, versionComponentMap)
```
Writing client tests:

- Tests are ESM (matching the client codebase). Import Vitest functions explicitly: `import { describe, it, expect, vi } from 'vitest'`.
- `vi.mock()` works for ESM imports. Mock HTTP calls by mocking `@/http-common`.
- Mount components with `@vue/test-utils`. Stub child components and router as needed.
Run tests with a coverage report:

```
docker compose -f docker-compose.dev.yml run --rm --no-deps api npx vitest run --coverage
docker compose -f docker-compose.dev.yml run --rm --no-deps client npx vitest run --coverage
```

Coverage is enforced at 80% for statements, branches, functions, and lines (configured in `api/vitest.config.js` and `client/vitest.config.js`). The CI workflow runs coverage on every pull request and will fail if thresholds are not met.
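For reference, a threshold block of this kind typically looks like the sketch below. This is the illustrative shape only (the exact keys depend on the Vitest version); the real `api/vitest.config.js` and `client/vitest.config.js` in the repo are authoritative:

```js
// Illustrative shape only; see the project's actual vitest.config.js files.
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    coverage: {
      // Fail the run if any metric drops below 80%.
      thresholds: { statements: 80, branches: 80, functions: 80, lines: 80 },
    },
  },
});
```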
If you see errors like `npx: not found` or unexpected behavior when running tests, you may have a stale Docker image cached from a previous build. This can happen when switching between the dev and production Dockerfiles, since Docker Compose reuses an existing image if the tag already matches.
To fix this, remove the old image and rebuild:
```
docker image rm bioenergyorg-client   # or bioenergyorg-api
docker compose -f docker-compose.dev.yml build --no-cache client
```

- Run `docker compose run api node scripts/import_datafeeds.js` from the root folder of the project.
- To redirect validation errors to a file, run `docker compose run api node scripts/import_datafeeds.js > import_datafeeds.txt 2>&1` (the `2>&1` must come after the file redirect so that stderr is captured in the file too).
- Under Windows PowerShell, use the following version of the above command to get a clean output file: `cmd /c "docker compose run api node scripts/import_datafeeds.js > import_datafeeds_after.txt 2>&1"`
- If you see warnings like `"VITE_*" variable is not set`, add that variable to your local `.env` file as an empty placeholder.
- https://expressjs.com/
- https://sequelize.org/
- https://vuejs.org/
- CABBI: https://cabbitools.igb.illinois.edu/brc/cabbi.json
- CBI: https://fair.ornl.gov/CBI/cbi.json
- GLBRC: https://fair-data.glbrc.org/glbrc.json
- JBEI: https://bioenergy.org/JBEI/jbei.json
Validating data against the BRC schema can be done with the LinkML framework.
- LinkML has a docker image available here: https://hub.docker.com/r/linkml/linkml
Note for Windows users: to run the validator script on Windows:
- First install WSL: `wsl --install`
- Then run the Ubuntu terminal: `wsl -d Ubuntu`
- Then follow the Unix instructions below.
- Note that WSL does not (by default) route traffic through VPNs. If you encounter connection timeouts when running this script under WSL, either disconnect from your VPN or follow these instructions: https://learn.microsoft.com/en-us/windows/wsl/troubleshooting#wsl-has-no-network-connectivity-once-connected-to-a-vpn
This process, including installing LinkML, can be done with the validation script in this repo: `./validate.sh`

Alternatively, the process may be done manually:

- Install the LinkML Python package as detailed here.
- Retrieve a local copy of the data collection in JSON format. For example, run `wget https://bioenergy.org/JBEI/jbei.json`
- Retrieve the most recent version of the schema in YAML format. The schema is here: https://github.com/bioenergy-research-centers/brc-schema/blob/main/src/brc_schema/schema/brc_schema.yaml
- Run the following `linkml` command: `linkml validate --schema brc_schema.yaml -C Dataset <datafile>`, replacing `<datafile>` with the path to your data in JSON.
- For example, a fully valid `jbei.json` will yield the following result:

  ```
  $ linkml validate --schema brc_schema.yaml -C Dataset jbei.json
  No issues found
  ```

- Places where the data does not comply with the schema will be indicated like below:

  ```
  $ linkml validate --schema src/brc_schema/schema/brc_schema.yaml -C Dataset jbei-bad.json
  [ERROR] [jbei-bad.json/0] Additional properties are not allowed ('DATE' was unexpected) in /
  [ERROR] [jbei-bad.json/0] 'date' is a required property in /
  [ERROR] [jbei-bad.json/1] 'yes' is not of type 'boolean', 'null' in /creator/0/primaryContact
  [ERROR] [jbei-bad.json/8] Additional properties are not allowed ('BRC' was unexpected) in /
  [ERROR] [jbei-bad.json/8] 'brc' is a required property in /
  ```
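To illustrate the kinds of checks the validator performs, the toy function below reproduces two of the error classes from the example output: required lowercase keys (`date`, `brc`), and the boolean-or-null type of `primaryContact`. It is a rough sketch inferred only from those error messages, not the BRC schema; always use `linkml validate` for real validation:

```javascript
// Toy pre-check mirroring two error classes from the validator output above:
// required keys must be present under their lowercase names, and
// primaryContact must be a boolean or null. Illustrative only; this is
// NOT the BRC schema, which defines many more slots and constraints.
function preCheck(record) {
  const errors = [];
  for (const key of ["date", "brc"]) {
    if (!(key in record)) errors.push(`'${key}' is a required property`);
  }
  for (const [i, c] of (record.creator || []).entries()) {
    if (c.primaryContact !== null && typeof c.primaryContact !== "boolean") {
      errors.push(`'${c.primaryContact}' is not of type 'boolean', 'null' in /creator/${i}/primaryContact`);
    }
  }
  return errors;
}

// Three errors: missing 'date' (only uppercase 'DATE' present), missing
// 'brc', and a string where a boolean/null was expected.
console.log(preCheck({ DATE: "2024-01-01", creator: [{ primaryContact: "yes" }] }));
```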
InterBRC Data Products Portal Copyright (c) 2025, The Regents of the University of California, through Lawrence Berkeley National Laboratory, and UT-Battelle LLC, through Oak Ridge National Laboratory (both subject to receipt of any required approvals from the U.S. Dept. of Energy), University of Wisconsin - Madison, University of Illinois Urbana - Champaign, and Michigan State University. All rights reserved.
If you have questions about your rights to use or distribute this software, please contact Berkeley Lab's Intellectual Property Office at [email protected].
NOTICE. This Software was developed under funding from the U.S. Department of Energy and the U.S. Government consequently retains certain rights. As such, the U.S. Government has been granted for itself and others acting on its behalf a paid-up, nonexclusive, irrevocable, worldwide license in the Software to reproduce, distribute copies to the public, prepare derivative works, and perform publicly and display publicly, and to permit others to do so.