
After updating to v3.1, large repo build takes 3 hours #9754

@anaclumos


Have you read the Contributing Guidelines on issues?

Prerequisites

Description

~/Downloads/www  on   main *3 +9 !1  qqqq                                                                                                               128 ✘  took 56s   base   at 12:56:46 PM 
$ $npm_execpath run prepare-to-launch && $npm_execpath run scc && $npm_execpath run format && git add . && git commit -m 'wrote something' && git push && $npm_execpath run build && $npm_execpath redirects && until $npm_execpath ship; do :; done
$ $npm_execpath run clear && $npm_execpath run sanitize && $npm_execpath run process-blog && $npm_execpath run process-docs && $npm_execpath run backlinks && $npm_execpath run figcaption && $npm_execpath run readme
$ docusaurus clear && rm -rf 'blog' && rm -rf 'docs' && rm -rf '**/*.config.js' && rm -rf '**/*.config.js.map' && rm -f 'docusaurus.config.js.map' && rm -f 'docusaurus.config.js' && rm -rf 'i18n /**/*.md' && cp tools/안녕.md i18n/ko/docusaurus-plugin-content-docs/current/Hey.md && rm -rf 'i18n /**/*.png' && rm -rf 'i18n /**/*.svg' && rm -rf 'i18n /**/*.jpg' && rm -rf 'i18n /**/*.jpeg'
[SUCCESS] Removed the Webpack persistent cache folder at "/Users/cho/Downloads/www/node_modules/.cache".
$ python3 tools/sanitize.py
Found 2289 MD and MDX files.
Replaced 0 hex marks.
$ python3 tools/process-blog.py
$ python3 tools/process-docs.py
Replaced 10642 wikilinks.
$ python3 tools/process-backlinks.py
Found 4552 MD files.
Wrote 2860 files with 8283 mentions to backlinks.ts.
Wrote 2276 filenames to filenames.ts.
$ python3 tools/img-alt-to-figcaption.py
Found 2324 MD and MDX files.
Replaced 1320 alt texts.
$ cp tools/README.src.md README.md && printf "\n\n## Last updated \n\n$(date)\n" >> README.md
$ printf '\n## Stats\n' >> README.md && printf '\n```\n' >> README.md && scc . >> README.md && printf '```\n' >> README.md
$ prettier --log-level silent --config .prettierrc -w '**/*.{ts,tsx,json,md,mdx,css,scss,html,yml,yaml,mts,mjs,cts,cjs,js,jsx,xml}'
[main 478a064d] wrote something
 9 files changed, 13 insertions(+), 6 deletions(-)
 create mode 100644 Research/assets/F6FE2E.png
Enumerating objects: 30, done.
Counting objects: 100% (30/30), done.
Delta compression using up to 12 threads
Compressing objects: 100% (16/16), done.
Writing objects: 100% (16/16), 1.43 MiB | 33.99 MiB/s, done.
Total 16 (delta 6), reused 0 (delta 0), pack-reused 0
remote: Resolving deltas: 100% (6/6), completed with 6 local objects.
To https://github.com/anaclumos/extracranial.git
   f316c4bd..478a064d  main -> main
$ NODE_OPTIONS="--max-old-space-size=16384" docusaurus build
[INFO] Website will be built for all these locales:
- en
- ko
[INFO] [en] Creating an optimized production build...

✔ Client
  Compiled successfully in 1.30m

✔ Server
  Compiled successfully in 4.46m

[SUCCESS] Generated static files in "build".
[INFO] [ko] Creating an optimized production build...

✔ Client
  Compiled successfully in 1.35m

✔ Server
  Compiled successfully in 4.43m

[SUCCESS] Generated static files in "build/ko".
[INFO] Use `npm run serve` command to test your build locally.
$ cp _redirects build/_redirects
$ wrangler pages deploy ./build --commit-dirty=true --project-name=memex
🌎  Uploading... (16403/16403)

✨ Success! Uploaded 8158 files (8245 already uploaded) (155.03 sec)

✨ Uploading _redirects
✨ Deployment complete! Take a peek over at https://25a4b06d.memex.pages.dev
⌛ Done in 12618.98s.

    ~/Downloads/www  on   main *3 !2                                                                                                                     ✔  took 3h 30m 19s   base   at 04:48:29 PM

On the last line, note the 3h 30m 19s. Even though the client and server each compiled in ~4 minutes, the build then just hangs there, and the node process uses ~7 GB of RAM. In previous versions it went up to ~14 GB; was there a change in how Docusaurus limits RAM usage at the expense of compilation speed?
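To see whether the node process plateaus at ~7 GB or keeps growing during the long tail, one rough way to sample its resident memory is a helper like this (my own sketch, not part of the repo's tooling; the `pgrep` pattern is an assumption):

```shell
# Hypothetical helper: print a process's resident set size in MiB.
# `ps -o rss=` reports RSS in KiB on both macOS and Linux.
sample_rss() {
  ps -o rss= -p "$1" | awk '{printf "%.1f MiB\n", $1/1024}'
}

# Usage sketch: find the build's node process and sample it every 30 s
# while the build appears to hang:
#   PID=$(pgrep -f "docusaurus build" | head -1)
#   while kill -0 "$PID" 2>/dev/null; do sample_rss "$PID"; sleep 30; done
```

A flat curve would suggest the process is bounded by the heap limit and thrashing in GC rather than leaking.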


Reproducible demo

https://github.com/anaclumos/extracranial

Steps to reproduce

  1. Run all-in-one:build
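End to end, the reproduction could look roughly like this (a sketch, not a verified recipe; the package manager and the exact `all-in-one:build` script name are assumptions from the repo):

```shell
# Reproduction sketch (assumes Node.js and yarn are installed;
# the full build reportedly takes ~3.5 hours, so budget accordingly).
git clone https://github.com/anaclumos/extracranial.git
cd extracranial
yarn install
yarn all-in-one:build
```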

Expected behavior

The build completes relatively quickly, preferably in under 30 minutes.

Actual behavior

The build takes about 3.5 hours (3h 30m 19s in the log above).

Your environment

  • Public source code:
  • Public site URL:
  • Docusaurus version used:
  • Environment name and version (e.g. Chrome 89, Node.js 16.4):
  • Operating system and version (e.g. Ubuntu 20.04.2 LTS):

Self-service

  • I'd be willing to fix this bug myself.


    Labels

      • bug: An error in the Docusaurus core causing instability or issues with its execution
      • domain: performance: Related to bundle size or perf optimization
