Travis optimization #2404
Conversation
…se made the checker go into CPU overdrive
It worked?
.travis.yml (outdated)

  - stage: post-build
    name: Deploy
    if: branch = master
This condition replaces deploy.sh's internal condition, right?
I'll clean that up in a bit
Done
.travis.yml (outdated)

    name: Link check
    script: yarn linkcheck

  - stage: post-build
Why is this one post-build? We shouldn't build if this command fails.
We can re-order as we see fit later. Right now I just need to get the pieces working, then we can discuss what the pipeline should do.
…. Try to centralize github API usage to avoid rate limitations
.travis.yml (outdated)

@@ -51,6 +51,7 @@ jobs:
  cache:
    pip: true
    directories:
      - dist
This should be revisited. At this point it was already deployed, right?
Yeah, I also forgot that the dist folder is actually not even checked with proselint at all, just src/content. So this is just useless.
17c5d9f to fbd7a2f (compare)
This branch is ready to go! I've leveraged as much caching on Travis as possible, which has sped up yarn installs and proselint runs by a lot. I've switched the Travis setup to run on the cheap and fast container infrastructure, which means the tests start faster.

The lint+build stage runs all linting except proselint, which runs in parallel in a separate container. The lint+build stage saves build artefacts in a cache. After the build has succeeded, the external link check runs on the build artefacts in a separate container. This is a non-blocking test run, to avoid massive blockage of PRs based on external links changing. After the build has succeeded, and if the build is on the master branch, a deploy is initialized with the cached artefacts from the build in a previous container.

The fetching of the repositories for loaders and plugins, which previously caused our GitHub API usage to exceed the rate limit, has now been removed from the build pipeline. Instead it is a script you run manually as a maintainer or contributor (yarn update-repos), which saves the updated repo list in version control. The build uses that list of repos to fetch the readmes without ever hitting the GitHub API, thus avoiding the rate limitation.

All of the above means we can increase the number of parallel builds on Travis, which are currently limited to 1. I think 4 is the maximum we can get, which will give us a significant improvement in feedback time.

Latest build of this branch: https://travis-ci.org/webpack/webpack.js.org/builds/409489808

Build times are around 5 minutes for the blocking part and 4 minutes for the non-blocking part. I've set Travis up with fast_finish, which should give a green light in GitHub as soon as the blocking part of the tests is finished.
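To make the repo-fetching flow concrete, here is a minimal sketch of the build-time half, assuming the committed list lives at ./repositories/loaders.json and that readmes are pulled from raw.githubusercontent.com, which does not count against the GitHub API rate limit. The file path and function names are illustrative assumptions, not the actual scripts in this PR:

    // Sketch only: fetch readmes from the committed repo list without the GitHub API.
    const fs = require('fs');
    const https = require('https');

    // The list is produced by the maintainer-run update script and committed to git,
    // e.g. ["webpack-contrib/css-loader", "webpack-contrib/file-loader", ...]
    const repos = JSON.parse(fs.readFileSync('./repositories/loaders.json', 'utf8'));

    function fetchReadme(repo) {
      return new Promise((resolve, reject) => {
        https
          .get(`https://raw.githubusercontent.com/${repo}/master/README.md`, (res) => {
            let body = '';
            res.on('data', (chunk) => (body += chunk));
            res.on('end', () => resolve(body));
          })
          .on('error', reject);
      });
    }

    Promise.all(repos.map(fetchReadme)).then((readmes) => {
      console.log('Fetched', readmes.length, 'readmes without touching the GitHub API');
    });

Under this scheme the manually run yarn update-repos step would be the only place that talks to api.github.com, so its output can be reviewed and committed like any other change.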
Note that this PR merges to …
include:
  - stage: Build
    name: Lint and Build
    before_install: npm install --global yarn
Travis will automatically use yarn once it sees a yarn.lock file; we don't need to install it explicitly.
"test": "npm run lint", | ||
"lint": "run-s lint:*", | ||
"lint:js": "eslint src --ext .js,.jsx,.md", | ||
"lint:js": "eslint src --ext .js,.jsx,.md --cache true --cache-location .cache/.eslintcache", |
Nice one! 👍🏽
await writeFile(fileName, headmatter + body);

console.log('Generated:', path.relative(cwd, fileName));
Don't mean to sound too picky, but perhaps we could add a space after Generated: here?
The console logging adds a space between arguments :)
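To illustrate the point, Node's console.log inserts a single space between its arguments; the path below is made up:

    console.log('Generated:', 'src/content/loaders/css-loader.md');
    // prints: Generated: src/content/loaders/css-loader.md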
fs.writeFile(jsonPath, json, (err) => {
  if (err) {
    throw err;
  }
Do you think it would make sense to add some basic logging here, the same as in the fetch-package-readmes script?
That would indeed make sense. I'd prefer if someone else adds that. I'm about to be offline for a week
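For whoever picks this up, here is a minimal sketch of what that logging could look like, assuming path is imported and cwd is defined as in the readme-fetching script; the Generated: message format is an assumption borrowed from fetch-package-readmes, not code from this PR:

    const path = require('path');
    const cwd = process.cwd();

    fs.writeFile(jsonPath, json, (err) => {
      if (err) {
        throw err;
      }
      // Assumed log format, mirroring the fetch-package-readmes output
      console.log('Generated:', path.relative(cwd, jsonPath));
    });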
Awesome work @Munter, can't wait to try this out! I like the idea of not having to fetch GH repos every 5 minutes, and it's gonna solve the API limit issue finally 🎉
@jeremenichelli @EugeneHlushko Any feedback? I would like to merge this one.
@Munter There is a cron job that deploys every day. How would this affect it?
@montogeek Since I kept the script names intact, I think there will be no impact on any cron job.
@Munter Yep
I will merge this one later today.
* Update link checking to use latest hyperlink. Replaces #1582
* Stop appending empty fragment to urls pointing ad no fragment identifier
* Resolve relative hrefs from README's to their github.com url instead of linking locally inside docs
* Prefer https where available and also use https in examples
* Make non-fragment links inside headings visible
* Update webpack docs china link
* Fixed invalid edit links to index pages
* skip fragment check for vimdoc that is known to fail
* Fix broken link to migration guide from configuration/resolve
* Also resolve github README relative links when they don't start with a dot
* Add missing pages: Vote, Organization, Starter kits
* Remove unused code
* Fix link that was redirected
* Open all target='_blank' links with rel='noopener' for security
* Skip link checking img.shields.io
* Change order of markdown url rewriting to avoid errors
* Use lowercase variable name for node url module. See #2382 (comment)
* Update link to file-loader github repo
* Rewrite links from https://npmjs.com to https://www.npmjs.com to avoid the inevitable redirect
* Skip checking links to npmjs.com/package that causes hyperlink to get http 500 responses. This should probably be reverted and fixed later
* Updated organization projects urls to their redirect target
* Update tool-list to get correct links for projects
* Updated links to their redirect target
* Remove jscs-loader. The project is deprecated, merged with eslint and the domain we link to is for sale
* Update link to karma-webpack project
* Remove link to named-modules-plugin, which is now in webpack core
* Added missing mini-css-extract-plugin to fetch script
* Skip checking links to known deleted github users
* Update broken fragments to work with new markdown generator
* Fixed more broken links
* infra(site) Copy assets when building
* Update github location for css-loader
* Remove dead github accounts from contributors listings
* infra(site) Fix fetchPackages script when running locally
* Fix internal fragmtn links in optimization.md
* Skip link checking or opencollective.com/webpack. Massive html response made the checker go into CPU overdrive
* Remove link to non-existing named-modules-plugin
* Use new release of hyperlink
* feat(infra) Travis optimization (#2404)
* Fix internal fragmtn links in optimization.md
* Skip link checking or opencollective.com/webpack. Massive html response made the checker go into CPU overdrive
* Try out travis staged build
* Add proselint
* Upgrade pip before using it
* Move before-hooks around to try and make proselint install
* Try adding python as a language as well to geet an updated version
* More messing with config
* Manually handle node versioning
* Add node minor version to nvm install. Defaulted to slightly too low version
* Manually install node 8.11
* Try a matrix build to separate node and python stuff
* Add linkcheck task and try a different cahce setup
* Minor name change to test if cache works correctly across builds
* Attempt to combine matrix and staged build
* Attempt going back to staged build
* Bump travis
* Ping Travis. You alive?
* Fix broken travis.yml
* Fix wrong stage order
* Explicitly run specific lintings, exclude proselint
* Allow failures for link checker
* Change proselint cache directory. Maybe this works
* Add new script to fetch repository names that the docs should contain. Try to centralize github API usage to avoid rate limitations
* Add caching to eslint
* Remove parts of deploy.sh that are now run by travis
* Added new ./repositories to store github api results
* Replace fetch.sh with npm scripts and fetch-package-readmes.js
* Attempt to make caches more specific to the containers and stages they refer to
* Remove deprecaed fetch_packages script. Now replaced by two-step fetch-package-repos and fetch-package-readmes
* Attempt a more windows friendly build by using npm and webpack instead of shell commands
* Disable link checking for now to speed up builds
* Revert "Disable link checking for now to speed up builds" This reverts commit 7d4bb06.
* Add dist to proselint cache so generated content also gets checked
* Remove unnessessary GITHUB_TOKEN env variable check in repo fetching script
* Revert "Add dist to proselint cache so generated content also gets checked" This reverts commit fc7676d.
* Rework pipeline for better speed and make proselint a deployment blocker
* Rename build stage to reflect its actions
* Run content-tree only after generating all external content
* Remove link to non-existing named-modules-plugin
* Fix double slashes in edit links
* Rename stages
* Add new internal link check as a blocking check
* Fix wrong name in allowed_failures config