Use single object spread call in loose mode #11520
nicolo-ribaudo merged 3 commits into babel:master
Conversation
```diff
@@ -0,0 +1,43 @@
"use strict";
Object.defineProperty(Object.prototype, 'NOSET', {
```
These were polluting the shared context, causing the other tests to fail.
Build successful! You can test your changes in the REPL here: https://babeljs.io/repl/build/21864/
This pull request is automatically built and testable in CodeSandbox. Latest deployment of this branch, based on commit 654e0a3.
Comparing the CI build time between master and this PR, the unit tests now take about 30% longer. My gut says it comes from the cost of creating a fresh context and evaluating @babel/polyfill and babel-helpers-in-memory for every single exec test.
Instead of running the common modules for every single exec test, can we leverage the `cachedData` option? We can generate a code cache after running the common modules once, and supply it to every exec test context.
Excellent idea! Done. This gave me a roughly 20% speedup.
JLHwung left a comment
Makes every exec.js test run in a new VM context (necessary for the new tests)
Can we separate this into a new PR? Even with the code cache, running yarn jest object-rest-spread now takes 8 seconds vs. 4 seconds on master.
Note that we can circumvent this issue by adding some suffix:

```js
Object.defineProperty(Object.prototype, `NOSET${__filename}`, {
  get(value) {
    // noop
  },
});
```

I am good on the loose mode changes, but we should see if we can do better on the exec context.
Force-pushed from d1fcc8f to 654e0a3.
Done.
Thank you for making this change, excited to try it out once it is released :)
Does 2 things:
- Makes every `exec.js` test run in a new VM context (necessary for the new tests)
- Uses a single object spread call in `loose` mode, making the output considerably smaller.
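The loose-mode output can be sketched as follows: instead of emitting one helper call per spread segment, the whole object literal compiles to a single merge call. The helper name `_extends` below is illustrative of Babel's assign-like loose helper, not a quote of the actual generated code:

```javascript
// Source:            const merged = { ...a, b: 2, ...c };
// Loose-mode output: a single call that copies own enumerable
// properties left-to-right (sketch; helper name illustrative).
function _extends(target) {
  for (let i = 1; i < arguments.length; i++) {
    const source = arguments[i];
    for (const key in source) {
      if (Object.prototype.hasOwnProperty.call(source, key)) {
        target[key] = source[key];
      }
    }
  }
  return target;
}

const a = { x: 1 };
const c = { y: 3 };
const merged = _extends({}, a, { b: 2 }, c);
console.log(merged); // { x: 1, b: 2, y: 3 }
```

Because loose mode uses plain assignment rather than `Object.defineProperty`, it triggers setters on the target, which is exactly why the new tests guard against inherited accessors like the `NOSET` property above.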