Update dependency @sparticuz/chromium to v143 (#158)
Conversation
Warning: Review the following alerts detected in dependencies. According to your organization's Security Policy, it is recommended to resolve "Warn" alerts.
StephenHeaps left a comment:
brave/slim-list-lambda was updated to Node 24 here. Only @sparticuz/chromium v143+ supports Node 24.
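The constraint behind this bump can be expressed as a deploy-time guard. A minimal sketch, assuming the package's `MajorChromiumVersion.MinorChromiumIncrement.PatchLevel` version scheme (described in the diffed README below); the helper name is hypothetical:

```javascript
// Hypothetical guard that fails fast if the pinned @sparticuz/chromium release
// is too old for the target Lambda runtime. Per the review comment, only
// v143+ supports the nodejs24.x runtime; the previous pin (v117) worked on
// the older Node 20/22 runtimes.
function supportsNode24(chromiumVersion) {
  // The leading component is the Chromium major version, e.g. "143.0.0" -> 143.
  const major = Number.parseInt(chromiumVersion.split(".")[0], 10);
  return major >= 143;
}

console.log(supportsNode24("143.0.0")); // → true
console.log(supportsNode24("117.0.0")); // → false
```

A check like this could run in CI before deploying to a `nodejs24` runtime, rather than discovering an incompatible binary at cold start.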
76e372d to 220fed7
[puLL-Merge] - Sparticuz/chromium@v117.0.0..v143.0.0 Diff

diff --git .github/ISSUE_TEMPLATE/bug-report.md .github/ISSUE_TEMPLATE/bug-report.md
index d2ad908..ac3d63e 100644
--- .github/ISSUE_TEMPLATE/bug-report.md
+++ .github/ISSUE_TEMPLATE/bug-report.md
@@ -6,7 +6,7 @@ labels: bug
---
<!---
-For Chromium-specific bugs, please refer to: https://bugs.chromium.org/p/chromium
+For Chromium-specific bugs, please refer to: https://issues.chromium.org/issues?q=status:open%20componentid:1456776
For Puppeteer-specific bugs, please refer to: https://github.com/puppeteer/puppeteer/issues
For Playwright-specific bugs, please refer to: https://github.com/microsoft/playwright/issues
-->
@@ -15,8 +15,9 @@ For Playwright-specific bugs, please refer to: https://github.com/microsoft/play
- `chromium` Version:
- `puppeteer` / `puppeteer-core` Version:
-- Node.js Version: <!-- 16.x | 18.x -->
-- Lambda / GCF Runtime: <!-- `nodejs16` | `nodejs18.x` -->
+- Node.js Version: <!-- 20.x | 22.x -->
+- Lambda / GCF Runtime: <!-- `nodejs20` | `nodejs22.x` -->
+- Runtime Architecture: <!-- `x64` | `arm64` -->
## Expected Behavior
diff --git .github/workflows/release.yml .github/workflows/release.yml
index c8d1e29..f183e9c 100644
--- .github/workflows/release.yml
+++ .github/workflows/release.yml
@@ -13,25 +13,34 @@ jobs:
# Install jq so I can edit package.json from the command line
- run: sudo apt-get install jq -y
- - uses: actions/checkout@v3
+ - uses: actions/checkout@v6
- name: Setup Node.js
- uses: actions/setup-node@v3
+ uses: actions/setup-node@v5
with:
- node-version: 18.x
+ node-version: 24.x
registry-url: https://registry.npmjs.org/
- run: npm ci
- run: npm run build
+ - name: Copy x64 binaries to bin
+ run: cp -R bin/x64/* bin
+
- name: Release chromium on npmjs
run: npm publish
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }}
- - name: Create Lambda Layer
- run: make chromium-${{ github.ref_name }}-layer.zip
+ - name: Cleanup bin after npm publish
+ run: rm -f bin/chromium.br bin/swiftshader.tar.br
+
+ - name: Create Lambda Layer (x64)
+ run: make chromium-${{ github.ref_name }}-layer.x64.zip
+
+ - name: Create Lambda Layer (arm64)
+ run: make chromium-${{ github.ref_name }}-layer.arm64.zip
# Change the package name to chromium-min,
# delete the bin folder from the files array
@@ -39,7 +48,7 @@ jobs:
- name: Cleanup and prepare for chromium-min
run: |
jq '.name="@sparticuz/chromium-min"' package.json > .package.json
- jq 'del(.files[] | select(. == "bin"))' .package.json > package.json
+ jq 'del(.files[] | select(. == "bin" or . == "!bin/arm64" or . == "!bin/x64"))' .package.json > package.json
jq '.homepage="https://github.com/Sparticuz/chromium#-min-package"' package.json > .package.json
mv .package.json package.json
rm package-lock.json
@@ -50,27 +59,32 @@ jobs:
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }}
- - name: Create Chromium Pack
+ - name: Create Chromium (x64) Pack
run: |
- cd bin
- tar -cvf chromium-${{ github.ref_name }}-pack.tar *
- mv chromium-${{ github.ref_name }}-pack.tar ..
- cd ..
+ npm run pack:x64
+ mv chromium-pack.x64.tar chromium-${{ github.ref_name }}-pack.x64.tar
+
+ - name: Create Chromium (arm64) Pack
+ run: |
+ npm run pack:arm64
+ mv chromium-pack.arm64.tar chromium-${{ github.ref_name }}-pack.arm64.tar
- name: Upload items to Github Release
- uses: ncipollo/release-action@v1.12.0
+ uses: ncipollo/release-action@v1.20.0
with:
tag: ${{ github.ref_name }}
body: |
# [@sparticuz/chromium ${{ github.ref_name }}](https://www.npmjs.com/package/@sparticuz/chromium), [@sparticuz/chromium-min ${{ github.ref_name }}](https://www.npmjs.com/package/@sparticuz/chromium-min)
- The `chromium-${{ github.ref_name }}-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code
+ The `chromium-${{ github.ref_name }}-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code
```
- bucketName="chromiumUploadBucket" && \
- aws s3 cp chromium-${{ github.ref_name }}-layer.zip "s3://${bucketName}/chromiumLayers/chromium-${{ github.ref_name }}-layer.zip" && \
- aws lambda publish-layer-version --layer-name chromium --description "Chromium ${{ github.ref_name }}" --content "S3Bucket=${bucketName},S3Key=chromiumLayers/chromium-${{ github.ref_name }}-layer.zip" --compatible-runtimes nodejs --compatible-architectures x86_64
+ bucketName="chromiumUploadBucket" && archType="x64" && \
+ aws s3 cp chromium-${{ github.ref_name }}-layer.${archType}.zip "s3://${bucketName}/chromiumLayers/chromium-${{ github.ref_name }}-layer.${archType}.zip" && \
+ aws lambda publish-layer-version --layer-name chromium --description "Chromium ${{ github.ref_name }}" --content "S3Bucket=${bucketName},S3Key=chromiumLayers/chromium-${{ github.ref_name }}-layer.${archType}.zip" --compatible-runtimes "nodejs20.x" "nodejs22.x" --compatible-architectures $(if [ "$archType" = "x64" ]; then echo "x86_64"; else echo "$archType"; fi)
```
- The `chromium-${{ github.ref_name }}-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.
- artifacts: "chromium-${{ github.ref_name }}-layer.zip,chromium-${{ github.ref_name }}-pack.tar"
+ The `chromium-${{ github.ref_name }}-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.
+
+ Support this project's continued development by becoming a [monthly sponsor on GitHub](https://github.com/sponsors/Sparticuz). Your contribution helps cover monthly maintenance costs and ensures ongoing improvements.
+ artifacts: "chromium-${{ github.ref_name }}-layer.x64.zip,chromium-${{ github.ref_name }}-layer.arm64.zip,chromium-${{ github.ref_name }}-pack.x64.tar,chromium-${{ github.ref_name }}-pack.arm64.tar"
prerelease: false
draft: true
generateReleaseNotes: true
diff --git a/.github/workflows/test-arm.yml b/.github/workflows/test-arm.yml
new file mode 100644
index 0000000..19c00ff
--- /dev/null
+++ .github/workflows/test-arm.yml
@@ -0,0 +1,88 @@
+name: AWS Lambda CI (arm64)
+
+on:
+ push:
+ branches: [master]
+ pull_request:
+ branches: [master]
+
+jobs:
+ build:
+ name: Build Lambda Layer
+ runs-on: ubuntu-24.04-arm
+ permissions:
+ # Required to checkout the code
+ contents: read
+ # Required to put a comment into the pull-request
+ pull-requests: write
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v6
+
+ - name: Setup Node.js
+ uses: actions/setup-node@v5
+ with:
+ node-version: 24.x
+
+ - name: Install Packages
+ run: npm ci
+
+ - name: Run Source Tests
+ run: npm run test:source
+ continue-on-error: true
+
+ - name: "Report Coverage"
+ uses: davelosert/vitest-coverage-report-action@v2
+
+ - name: Compile Typescript
+ run: npm run build
+
+ - name: Create Lambda Layer
+ run: make chromium.arm64.zip
+
+ - name: Upload Layer Artifact
+ uses: actions/upload-artifact@v5
+ with:
+ name: chromium.arm64.zip
+ path: chromium.arm64.zip
+
+ execute:
+ name: Lambda (Node ${{ matrix.version }}.x)
+ needs: build
+ runs-on: ubuntu-24.04-arm
+ strategy:
+ matrix:
+ event:
+ - example.com
+ version:
+ - 20
+ - 22
+ - 24
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v6
+
+ - name: Setup Python
+ uses: actions/setup-python@v6
+ with:
+ python-version: 3.13
+
+ - name: Setup AWS SAM CLI
+ uses: aws-actions/setup-sam@v2
+
+ - name: Download Layer Artifact
+ uses: actions/download-artifact@v6
+ with:
+ name: chromium.arm64.zip
+
+ - name: Provision Layer
+ run: unzip chromium.arm64.zip -d _/amazon/code
+
+ - name: Install test dependencies
+ run: npm install --prefix _/amazon/handlers puppeteer-core --bin-links=false --fund=false --omit=optional --omit=dev --package-lock=false --save=false
+
+ - name: Patch template.yml file with arm64 support
+ run: sed -i 's/x86_64/arm64/g' _/amazon/template.yml
+
+ - name: Invoke Lambda on SAM
+ run: sam local invoke --template _/amazon/template.yml --event _/amazon/events/${{ matrix.event }}.json node${{ matrix.version }} 2>&1 | (grep 'Error' && exit 1 || exit 0)
diff --git .github/workflows/aws.yml .github/workflows/test-x64.yml
similarity index 58%
rename from .github/workflows/aws.yml
rename to .github/workflows/test-x64.yml
index 5753ccc..e4501ac 100644
--- .github/workflows/aws.yml
+++ .github/workflows/test-x64.yml
@@ -1,4 +1,4 @@
-name: AWS Lambda CI
+name: AWS Lambda CI (x64)
on:
push:
@@ -10,29 +10,41 @@ jobs:
build:
name: Build Lambda Layer
runs-on: ubuntu-latest
+ permissions:
+ # Required to checkout the code
+ contents: read
+ # Required to put a comment into the pull-request
+ pull-requests: write
steps:
- name: Checkout
- uses: actions/checkout@v3
+ uses: actions/checkout@v6
- name: Setup Node.js
- uses: actions/setup-node@v3
+ uses: actions/setup-node@v5
with:
- node-version: 18.x
+ node-version: 24.x
- name: Install Packages
run: npm ci
+ - name: Run Source Tests
+ run: npm run test:source
+ continue-on-error: true
+
+ - name: "Report Coverage"
+ uses: davelosert/vitest-coverage-report-action@v2
+
- name: Compile Typescript
run: npm run build
- name: Create Lambda Layer
- run: make chromium.zip
+ run: make chromium.x64.zip
- name: Upload Layer Artifact
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v5
with:
- name: chromium
- path: chromium.zip
+ name: chromium.x64.zip
+ path: chromium.x64.zip
execute:
name: Lambda (Node ${{ matrix.version }}.x)
@@ -43,27 +55,29 @@ jobs:
event:
- example.com
version:
- - 16
- - 18
+ - 20
+ - 22
+ - 24
+
steps:
- name: Checkout
- uses: actions/checkout@v3
+ uses: actions/checkout@v6
- name: Setup Python
- uses: actions/setup-python@v4
+ uses: actions/setup-python@v6
with:
- python-version: "3.x"
+ python-version: 3.13
- name: Setup AWS SAM CLI
uses: aws-actions/setup-sam@v2
- name: Download Layer Artifact
- uses: actions/download-artifact@v3
+ uses: actions/download-artifact@v6
with:
- name: chromium
+ name: chromium.x64.zip
- name: Provision Layer
- run: unzip chromium.zip -d _/amazon/code
+ run: unzip chromium.x64.zip -d _/amazon/code
- name: Install test dependencies
run: npm install --prefix _/amazon/handlers puppeteer-core --bin-links=false --fund=false --omit=optional --omit=dev --package-lock=false --save=false
diff --git .gitignore .gitignore
index 4c04346..4b56d93 100644
--- .gitignore
+++ .gitignore
@@ -4,6 +4,7 @@
*.zip
bin/chromium-*.br
build
+coverage
node_modules
nodejs
_/amazon/samconfig.toml
@@ -12,3 +13,5 @@ _/amazon/.aws-sam
*.tgz
examples/**/package-lock.json
examples/**/.serverless
+docker
+fonts/fonts
diff --git Makefile Makefile
index 9872498..8249eb0 100644
--- Makefile
+++ Makefile
@@ -1,24 +1,49 @@
.PHONY: clean
+ARCH = $(shell uname -m | sed 's/x86_64/x64/' | sed 's/aarch64/arm64/')
+
clean:
rm -rf chromium.zip _/amazon/code/nodejs _/amazon/handlers/node_modules
pretest:
- unzip chromium.zip -d _/amazon/code
+ unzip chromium.$(ARCH).zip -d _/amazon/code
npm install --prefix _/amazon/handlers puppeteer-core@latest --bin-links=false --fund=false --omit=optional --omit=dev --package-lock=false --save=false
test:
- sam local invoke --template _/amazon/template.yml --event _/amazon/events/example.com.json node18
+ sam local invoke --template _/amazon/template.yml --event _/amazon/events/example.com.json node22
+
+test20:
+ sam local invoke --template _/amazon/template.yml --event _/amazon/events/example.com.json node20
+
+presource:
+ cp -R bin/$(ARCH)/* bin
+
+postsource:
+ rm bin/chromium.br bin/al2023.tar.br bin/swiftshader.tar.br
-.fonts.zip:
- zip -9 --filesync --move --recurse-paths .fonts.zip .fonts/
+%.x64.zip:
+ npm install --fund=false --package-lock=false
+ npm run build
+ mkdir -p nodejs
+ npm install --prefix nodejs/ tar-fs@3.1.1 follow-redirects@1.15.11 --bin-links=false --fund=false --omit=optional --omit=dev --package-lock=false --save=false
+ cp -R bin/x64/* bin
+ npm pack
+ rm bin/chromium.br bin/al2023.tar.br bin/swiftshader.tar.br
+ mkdir -p nodejs/node_modules/@sparticuz/chromium/
+ tar --directory nodejs/node_modules/@sparticuz/chromium/ --extract --file sparticuz-chromium-*.tgz --strip-components=1
+ npx clean-modules --directory nodejs "**/*.d.ts" "**/@types/**" "**/*.@(yaml|yml)" --yes
+ rm sparticuz-chromium-*.tgz
+ mkdir -p $(dir $@)
+ zip -9 --filesync --move --recurse-paths $@ nodejs
-%.zip:
+%.arm64.zip:
npm install --fund=false --package-lock=false
npm run build
mkdir -p nodejs
- npm install --prefix nodejs/ tar-fs@2.1.1 follow-redirects@1.15.2 --bin-links=false --fund=false --omit=optional --omit=dev --package-lock=false --save=false
+ npm install --prefix nodejs/ tar-fs@3.1.1 follow-redirects@1.15.11 --bin-links=false --fund=false --omit=optional --omit=dev --package-lock=false --save=false
+ cp -R bin/arm64/* bin
npm pack
+ rm bin/chromium.br bin/al2023.tar.br bin/swiftshader.tar.br
mkdir -p nodejs/node_modules/@sparticuz/chromium/
tar --directory nodejs/node_modules/@sparticuz/chromium/ --extract --file sparticuz-chromium-*.tgz --strip-components=1
npx clean-modules --directory nodejs "**/*.d.ts" "**/@types/**" "**/*.@(yaml|yml)" --yes
@@ -26,4 +51,18 @@ test:
mkdir -p $(dir $@)
zip -9 --filesync --move --recurse-paths $@ nodejs
-.DEFAULT_GOAL := chromium.zip
+pack-x64:
+ cd bin/x64 && \
+ cp ../fonts.tar.br . && \
+ tar -cvf chromium-pack.x64.tar al2023.tar.br chromium.br fonts.tar.br swiftshader.tar.br && \
+ rm fonts.tar.br && \
+ mv chromium-pack.x64.tar ../..
+
+pack-arm64:
+ cd bin/arm64 && \
+ cp ../fonts.tar.br . && \
+ tar -cvf chromium-pack.arm64.tar al2023.tar.br chromium.br fonts.tar.br swiftshader.tar.br && \
+ rm fonts.tar.br && \
+ mv chromium-pack.arm64.tar ../..
+
+.DEFAULT_GOAL := chromium.x64.zip
diff --git README.md README.md
index 2b7a735..b6bcd30 100644
--- README.md
+++ README.md
@@ -1,72 +1,79 @@
# @sparticuz/chromium
[](https://www.npmjs.com/package/@sparticuz/chromium)
-[](bin/)
+[](bin/)
[](https://www.npmjs.com/package/@sparticuz/chromium)
[](https://www.npmjs.com/package/@sparticuz/chromium-min)
+[](https://github.com/Sparticuz/chromium/releases)
[](https://paypal.me/sparticuz)
-## Chromium for Serverless platforms
+## Chromium for Serverless Platforms
[sparticuz/chrome-aws-lambda](https://github.com/sparticuz/chrome-aws-lambda) was originally forked from [alixaxel/chrome-aws-lambda#264](https://github.com/alixaxel/chrome-aws-lambda/pull/264).
-The biggest difference, besides the chromium version, is the inclusion of some code from https://github.com/alixaxel/lambdafs, as well as dropping that as a dependency. Due to some changes in WebGL, the files in bin/swiftshader.tar.br need to be extracted to `/tmp` instead of `/tmp/swiftshader`. This necessitated changes in lambdafs.
-However, it quickly became difficult to maintain because of the pace of `puppeteer` updates. This package, `@sparticuz/chromium`, is not chained to `puppeteer` versions, but also does not include the overrides and hooks that the original package contained. It is only `chromium`, as well as the special code needed to decompress the brotli package, and a set of predefined arguments tailored to serverless usage.
+The main difference, aside from the Chromium version, is the inclusion of some code from https://github.com/alixaxel/lambdafs, while removing it as a dependency. Due to changes in WebGL, the files in `bin/swiftshader.tar.br` must now be extracted to `/tmp` instead of `/tmp/swiftshader`. This required changes in lambdafs.
+
+However, maintaining the package became difficult due to the rapid pace of `puppeteer` updates. `@sparticuz/chromium` is not tied to specific `puppeteer` versions and does not include the overrides and hooks found in the original package. It provides only Chromium, the code required to decompress the Brotli package, and a set of predefined arguments tailored for serverless environments.
## Install
-[`puppeteer` ships with a preferred version of `chromium`](https://pptr.dev/faq/#q-why-doesnt-puppeteer-vxxx-work-with-chromium-vyyy). In order to figure out what version of `@sparticuz/chromium` you will need, please visit [Puppeteer's Chromium Support page](https://pptr.dev/chromium-support).
+[`puppeteer` ships with a preferred version of `chromium`](https://pptr.dev/faq#q-why-doesnt-puppeteer-vxxx-work-with-a-certain-version-of-chrome-or-firefox). To determine which version of `@sparticuz/chromium` you need, visit the [Puppeteer Chromium Support page](https://pptr.dev/chromium-support).
-> For example, as of today, the latest version of `puppeteer` is `18.0.5`. The latest version of `chromium` stated on `puppeteer`'s support page is `106.0.5249.0`. So you need to install `@sparticuz/chromium@106`.
+> For example, as of today, the latest version of `puppeteer` is `18.0.5`, and the latest supported version of Chromium is `106.0.5249.0`. Therefore, you should install `@sparticuz/chromium@106`.
```shell
# Puppeteer or Playwright is a production dependency
npm install --save puppeteer-core@$PUPPETEER_VERSION
-# @sparticuz/chromium can be a DEV dependency IF YOU ARE USING A LAYER, if you are not using a layer, use as a production dependency!
+# @sparticuz/chromium can be a DEV dependency IF YOU ARE USING A LAYER. If you are not using a layer, use it as a production dependency!
npm install --save-dev @sparticuz/chromium@$CHROMIUM_VERSION
```
-If your vendor does not allow large deploys (`chromium.br` is 50+ MB), you'll need to host the `chromium-v#-pack.tar` separately and use the [`@sparticuz/chromium-min` package](https://github.com/Sparticuz/chromium#-min-package).
+If your vendor does not allow large deployments (since `chromium.br` is over 50 MB), you will need to host the `chromium-v#-pack.tar` separately and use the [`@sparticuz/chromium-min` package](https://github.com/Sparticuz/chromium#-min-package).
```shell
npm install --save @sparticuz/chromium-min@$CHROMIUM_VERSION
```
-If you wish to install an older version of Chromium, take a look at [@sparticuz/chrome-aws-lambda](https://github.com/Sparticuz/chrome-aws-lambda#versioning) or [@alixaxel/chrome-aws-lambda](https://github.com/alixaxel/chrome-aws-lambda).
+If you need to install an older version of Chromium, see [@sparticuz/chrome-aws-lambda](https://github.com/Sparticuz/chrome-aws-lambda#versioning) or [@alixaxel/chrome-aws-lambda](https://github.com/alixaxel/chrome-aws-lambda).
## Versioning
The @sparticuz/chromium version schema is as follows:
`MajorChromiumVersion.MinorChromiumIncrement.@Sparticuz/chromiumPatchLevel`
-Because this package follows Chromium's releases, it does NOT follow semantic versioning. **Breaking changes can occur with the 'patch' level.** Please check the release notes for information on breaking changes.
+Because this package follows Chromium's release cycle, it does NOT follow semantic versioning. **Breaking changes may occur at the 'patch' level.** Please check the release notes for details on breaking changes.
## Usage
-This package works with all the currently supported AWS Lambda Node.js runtimes out of the box.
+This package works with all currently supported AWS Lambda Node.js runtimes out of the box.
```javascript
const test = require("node:test");
const puppeteer = require("puppeteer-core");
const chromium = require("@sparticuz/chromium");
-// Optional: If you'd like to use the legacy headless mode. "new" is the default.
-chromium.setHeadlessMode = true;
-
// Optional: If you'd like to disable webgl, true is the default.
chromium.setGraphicsMode = false;
-// Optional: Load any fonts you need. Open Sans is included by default in AWS Lambda instances
+// Optional: Load any fonts you need.
await chromium.font(
"https://raw.githack.com/googlei18n/noto-emoji/master/fonts/NotoColorEmoji.ttf"
);
test("Check the page title of example.com", async (t) => {
+ const viewport = {
+ deviceScaleFactor: 1,
+ hasTouch: false,
+ height: 1080,
+ isLandscape: true,
+ isMobile: false,
+ width: 1920,
+ };
const browser = await puppeteer.launch({
- args: chromium.args,
- defaultViewport: chromium.defaultViewport,
+ args: puppeteer.defaultArgs({ args: chromium.args, headless: "shell" }),
+ defaultViewport: viewport,
executablePath: await chromium.executablePath(),
- headless: chromium.headless,
+ headless: "shell",
});
const page = await browser.newPage();
@@ -74,7 +81,7 @@ test("Check the page title of example.com", async (t) => {
const pageTitle = await page.title();
await browser.close();
- assert.strictEqual(pageTitle, "Example Domain");
+ t.assert.strictEqual(pageTitle, "Example Domain");
});
```
@@ -88,9 +95,9 @@ const chromium = require("@sparticuz/chromium");
test("Check the page title of example.com", async (t) => {
const browser = await playwright.launch({
- args: chromium.args,
+ args: chromium.args, // Playwright merges the args
executablePath: await chromium.executablePath(),
- headless: chromium.headless,
+ // headless: true, /* true is the default */
});
const context = await browser.newContext();
@@ -99,112 +106,202 @@ test("Check the page title of example.com", async (t) => {
const pageTitle = await page.title();
await browser.close();
- assert.strictEqual(pageTitle, "Example Domain");
+ t.assert.strictEqual(pageTitle, "Example Domain");
});
```
-You should allocate at least 512 MB of RAM to your instance, however 1600 MB (or more) is recommended.
+You should allocate at least 512 MB of RAM to your instance; however, 1600 MB (or more) is recommended.
-### -min package
+### -min Package
-The -min package DOES NOT include the chromium brotli files. There are a few instances where this is useful. Primarily, this is useful when your host has file size limits.
+The -min package does NOT include the Chromium Brotli files. This is useful when your host has file size limits.
-To use the -min package please install the `@sparticuz/chromium-min` package.
+To use the -min package, install the `@sparticuz/chromium-min` package instead of `@sparticuz/chromium`
-When using the -min package, you need to specify the location of the brotli files.
+When using the -min package, you must specify the location of the Brotli files.
-In this example, /opt/chromium contains all the brotli files
+In this example, `/opt/chromium` contains all the Brotli files:
```
/opt
/chromium
- /aws.tar.br
+ /al2023.tar.br
/chromium.br
+ /fonts.tar.br
/swiftshader.tar.br
```
```javascript
+const viewport = {
+ deviceScaleFactor: 1,
+ hasTouch: false,
+ height: 1080,
+ isLandscape: true,
+ isMobile: false,
+ width: 1920,
+};
const browser = await puppeteer.launch({
- args: chromium.args,
- defaultViewport: chromium.defaultViewport,
+ args: puppeteer.defaultArgs({ args: chromium.args, headless: "shell" }),
+ defaultViewport: viewport,
executablePath: await chromium.executablePath("/opt/chromium"),
- headless: chromium.headless,
+ headless: "shell",
});
```
-In the following example, https://www.example.com/chromiumPack.tar contains all the brotli files. Generally, this would be a location on S3, or another very fast downloadable location, that is in close proximity to your function's execution location.
+In the following example, `https://www.example.com/chromiumPack.tar` contains all the Brotli files. Generally, this would be a location on S3 or another very fast downloadable location that is close to your function's execution environment.
-On the initial iteration, `@sparticuz/chromium` will download the pack tar file, untar the files to `/tmp/chromium-pack`, then will un-brotli the `chromium` binary to `/tmp/chromium`. The following iterations will see that `/tmp/chromium` exists and will use the already downloaded files.
+On the first run, `@sparticuz/chromium` will download the pack tar file, untar the files to `/tmp/chromium-pack`, and then decompress the `chromium` binary to `/tmp/chromium`. Subsequent runs (during a warm start) will detect that `/tmp/chromium` exists and use the already downloaded files.
-The latest chromium-pack.tar file will be on the latest [release](https://github.com/Sparticuz/chromium/releases).
+The latest `chromium-pack.arch.tar` file is available in the latest [release](https://github.com/Sparticuz/chromium/releases).
```javascript
+const viewport = {
+ deviceScaleFactor: 1,
+ hasTouch: false,
+ height: 1080,
+ isLandscape: true,
+ isMobile: false,
+ width: 1920,
+};
const browser = await puppeteer.launch({
- args: chromium.args,
- defaultViewport: chromium.defaultViewport,
+ args: puppeteer.defaultArgs({ args: chromium.args, headless: "shell" }),
+ defaultViewport: viewport,
executablePath: await chromium.executablePath(
"https://www.example.com/chromiumPack.tar"
),
- headless: chromium.headless,
+ headless: "shell",
});
```
### Examples
-Here are some example projects and help with other services
+Here are some example projects and guides for other services:
- [Production Dependency](https://github.com/Sparticuz/chromium/tree/master/examples/production-dependency)
- [Serverless Framework with Lambda Layer](https://github.com/Sparticuz/chromium/tree/master/examples/serverless-with-lambda-layer)
- [Serverless Framework with Pre-existing Lambda Layer](https://github.com/Sparticuz/chromium/tree/master/examples/serverless-with-preexisting-lambda-layer)
- [Chromium-min](https://github.com/Sparticuz/chromium/tree/master/examples/remote-min-binary)
-- AWS SAM _TODO_
+- [AWS SAM](https://github.com/Sparticuz/chromium/tree/master/examples/aws-sam)
- [Webpack](https://github.com/Sparticuz/chromium/issues/24#issuecomment-1343196897)
- [Netlify](https://github.com/Sparticuz/chromium/issues/24#issuecomment-1414107620)
-### Running Locally & Headless/Headful mode
+### Running Locally & Headless/Headful Mode
-This version of `chromium` is built using the `headless.gn` build variables, which does not appear to even include a GUI. [Also, at this point, AWS Lambda 2 does not support a modern version of `glibc`](https://github.com/aws/aws-lambda-base-images/issues/59), so this package does not include an ARM version yet, which means it will not work on any M Series Apple products. If you need to test your code using a headful or ARM version, please use your locally installed version of `chromium/chrome`, or you may use the `puppeteer` provided version.
+This version of Chromium is built using the `headless.gn` build variables, which do not include a GUI. If you need to test your code using a headful instance, use your locally installed version of Chromium/Chrome, or the version provided by Puppeteer.
```shell
npx @puppeteer/browsers install chromium@latest --path /tmp/localChromium
```
-For more information on installing a specific version of `chromium`, checkout [@puppeteer/browsers](https://www.npmjs.com/package/@puppeteer/browsers).
+For more information on installing a specific version of `chromium`, check out [@puppeteer/browsers](https://www.npmjs.com/package/@puppeteer/browsers).
-For example, you can set your code to use an ENV variable such as `IS_LOCAL`, then use if/else statements to direct puppeteer to the correct environment.
+For example, you can set your code to use an environment variable such as `IS_LOCAL`, then use if/else statements to direct Puppeteer to the correct environment.
```javascript
+const viewport = {
+ deviceScaleFactor: 1,
+ hasTouch: false,
+ height: 1080,
+ isLandscape: true,
+ isMobile: false,
+ width: 1920,
+};
+const headlessType = process.env.IS_LOCAL ? false : "shell";
const browser = await puppeteer.launch({
- args: process.env.IS_LOCAL ? puppeteer.defaultArgs() : chromium.args,
- defaultViewport: chromium.defaultViewport,
+ args: process.env.IS_LOCAL
+ ? puppeteer.defaultArgs()
+ : puppeteer.defaultArgs({ args: chromium.args, headless: headlessType }),
+ defaultViewport: viewport,
executablePath: process.env.IS_LOCAL
? "/tmp/localChromium/chromium/linux-1122391/chrome-linux/chrome"
: await chromium.executablePath(),
- headless: process.env.IS_LOCAL ? false : chromium.headless,
+ headless: headlessType,
});
```
-## Frequently asked questions
+## Frequently Asked Questions
### Can I use ARM or Graviton instances?
-Amazon's default Lambda base image is quite old at this point and does not support newer versions of `glibc` that chromium requires. When Amazon Linux 2023 comes to Lambda as the default base image, ARM support should be possible. Ref: https://github.com/Sparticuz/chrome-aws-lambda/pull/11, https://github.com/aws/aws-lambda-base-images/issues/59
+YES! Starting at Chromium v135, @sparticuz/chromium includes an arm64 pack.
+
+### Can I use Google Chrome or Chrome for Testing? What is chrome-headless-shell?
-### Can I use Google Chrome or Chrome for Testing, what is headless_shell?
+`headless_shell` is a purpose-built version of Chromium specifically for headless purposes. It does not include a GUI and only works via remote debugging connection. This is what this package is built on.
-`headless_shell` is a purpose built version of `chromium` specific for headless purposes. It does not include the GUI at all and only works via remote debugging connection. Ref: https://chromium.googlesource.com/chromium/src/+/lkgr/headless/README.md, https://source.chromium.org/chromium/chromium/src/+/main:headless/app/headless_shell.cc
+- https://chromium.googlesource.com/chromium/src/+/lkgr/headless/README.md
+- https://source.chromium.org/chromium/chromium/src/+/main:headless/app/headless_shell.cc
+- https://developer.chrome.com/blog/chrome-headless-shell
### Can I use the "new" Headless mode?
From what I can tell, `headless_shell` does not seem to include support for the "new" headless mode.
-### It doesn't work with Webpack!?!
+### It doesn't work with Webpack!?
+
+Try marking this package as an external dependency.
+
+- https://webpack.js.org/configuration/externals/
+
+### I'm experiencing timeouts or failures closing Chromium
+
+This is a common issue. Chromium sometimes opens more pages than you expect. You can try the following:
+
+```typescript
+for (const page of await browser.pages()) {
+ await page.close();
+}
+await browser.close();
+```
+
+You can also try the following if one of the calls is hanging for some reason:
+
+```typescript
+await Promise.race([browser.close(), browser.close(), browser.close()]);
+```
+
+Always `await browser.close()`, even if your script is returning an error.
+
+### `BrowserContext` isn't working properly (Target.closed)
+
+You may not be able to create a new context. You can try to use the default context as seen in this patch: https://github.com/Sparticuz/chromium/issues/298
+
+### Do I need to use @sparticuz/chromium?
-Try marking this package as an external. Ref: https://webpack.js.org/configuration/externals/
+This package is designed to be run on a vanilla Lambda instance. If you are using a Dockerfile to publish your code to Lambda, it may be better to install Chromium and its dependencies from the distribution's repositories.
+
+### I need accessible PDF files
+
+This is due to the way @sparticuz/chromium is built. If you require accessible PDFs, you'll need to
+recompile Chromium yourself with the following patch. You can then use that binary with @sparticuz/chromium-min.
+
+_Note_: This will increase the time required to generate a PDF.
+
+```patch
+diff --git a/_/ansible/plays/chromium.yml b/_/ansible/plays/chromium.yml
+index b42c740..49111d7 100644
+--- a/_/ansible/plays/chromium.yml
++++ b/_/ansible/plays/chromium.yml
+@@ -249,8 +249,9 @@
+ blink_symbol_level = 0
+ dcheck_always_on = false
+ disable_histogram_support = false
+- enable_basic_print_dialog = false
+ enable_basic_printing = true
++ enable_pdf = true
++ enable_tagged_pdf = true
+ enable_keystone_registration_framework = false
+ enable_linux_installer = false
+ enable_media_remoting = false
+```
+
+### Can I use a language other than JavaScript (Node.js)?
+Yes, but you will need to write your own Brotli extraction logic and args handling (essentially, reimplement the TypeScript files). The binaries, once extracted, will work with any language.
+- C Sharp: https://github.com/Podginator/lambda-chromium-playwright-CSharp/tree/main
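In Node, the extraction step itself boils down to Brotli-inflating the shipped archives; the tar unpacking and args handling are the rest of the work. A sketch of just the core operation, demonstrated with an in-memory buffer rather than a real binary:

```typescript
import { brotliCompressSync, brotliDecompressSync } from "node:zlib";

// The shipped files (chromium.br, fonts.tar.br, swiftshader.tar.br) are
// Brotli streams; any language with a Brotli library can inflate them.
function inflateBrotli(compressed: Buffer): Buffer {
  return brotliDecompressSync(compressed);
}

// Round-trip demonstration with an in-memory stand-in for the binary.
const original = Buffer.from("fake chromium bytes");
const restored = inflateBrotli(brotliCompressSync(original));
```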
## Fonts
-The Amazon Linux 2 AWS Lambda runtime is not provisioned with any font faces.
+The AWS Lambda runtime is not provisioned with any font faces.
Because of this, this package ships with [Open Sans](https://fonts.google.com/specimen/Open+Sans), which supports the following scripts:
@@ -212,7 +309,7 @@ Because of this, this package ships with [Open Sans](https://fonts.google.com/sp
- Greek
- Cyrillic
-To provision additional fonts, simply call the `font()` method with an absolute path or URL:
+To provision additional fonts, call the `font()` method with an absolute path or URL:
```typescript
await chromium.font("/var/task/fonts/NotoColorEmoji.ttf");
@@ -224,15 +321,15 @@ await chromium.font(
> `Noto Color Emoji` (or similar) is needed if you want to [render emojis](https://getemoji.com/).
-> For URLs, it's recommended that you use a CDN, like [raw.githack.com](https://raw.githack.com/) or [gitcdn.xyz](https://gitcdn.xyz/).
+> For URLs, it's recommended that you use a CDN, such as [raw.githack.com](https://raw.githack.com/) or [gitcdn.xyz](https://gitcdn.xyz/).
This method should be invoked _before_ launching Chromium.
---
-Alternatively, it's also possible to provision fonts via AWS Lambda Layers.
+Alternatively, you can also provision fonts via AWS Lambda Layers.
-Simply create a directory named `.fonts` and place any font faces you want there:
+Create a directory named `.fonts` or `fonts` and place any font faces you want there:
```
.fonts
@@ -240,15 +337,22 @@ Simply create a directory named `.fonts` and place any font faces you want there
└── Roboto.ttf
```
-Afterwards, you just need to ZIP the directory and upload it as a AWS Lambda Layer:
+Afterwards, zip the directory and upload it as an AWS Lambda Layer:
```shell
-zip -9 --filesync --move --recurse-paths .fonts.zip .fonts/
+zip -9 --filesync --move --recurse-paths fonts.zip fonts/
```
+Font directories are specified in the `fonts.conf` file, which is packaged inside `bin/fonts.tar.br`. These are the default folders:
+
+- `/var/task/.fonts`
+- `/var/task/fonts`
+- `/opt/fonts`
+- `/tmp/fonts`
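As a hypothetical alternative to a layer, a font bundled with your function can be copied into one of these directories at cold start; `/tmp/fonts` is the only one writable at runtime on Lambda. This helper is illustrative, not part of the package's API:

```typescript
import { copyFileSync, existsSync, mkdirSync } from "node:fs";
import { basename, join } from "node:path";

// Copy a bundled font face into a fontconfig search directory.
// On Lambda, /tmp/fonts is the writable choice from the list above.
function provisionFont(sourcePath: string, fontsDir = "/tmp/fonts"): string {
  if (!existsSync(fontsDir)) {
    mkdirSync(fontsDir, { recursive: true });
  }
  const destination = join(fontsDir, basename(sourcePath));
  copyFileSync(sourcePath, destination);
  return destination;
}
```

Invoke it before launching Chromium, the same as the `font()` method.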
+
## Graphics
-By default, this package uses `swiftshader`/`angle` to do CPU acceleration for WebGL. This is the only known way to enable WebGL on a serverless platform. You can disable WebGL by setting `chromium.setGraphiceMode = false;` _before_ launching Chromium. Disabling this will also skip the extract of the `bin/swiftshader.tar.br` file, which saves about a second of initial execution time. Disabling graphics is recommended if you know you are not using any WebGL.
+By default, this package uses `swiftshader`/`angle` to do CPU acceleration for WebGL. This is the only known way to enable WebGL on a serverless platform. You can disable WebGL by setting `chromium.setGraphicsMode = false;` _before_ launching Chromium. Chromium still requires extracting the `bin/swiftshader.tar.br` file in order to launch. Testing is needed to determine if there is any positive speed impact from disabling WebGL.
## API
@@ -256,91 +360,69 @@ By default, this package uses `swiftshader`/`angle` to do CPU acceleration for W
| ----------------------------------- | ----------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `font(url)` | `Promise<string>` | Provisions a custom font and returns its basename. |
| `args` | `Array<string>` | Provides a list of recommended additional [Chromium flags](https://github.com/GoogleChrome/chrome-launcher/blob/master/docs/chrome-flags-for-tools.md). |
-| `defaultViewport` | `Object` | Returns a sensible default viewport for serverless. |
-| `executablePath(location?: string)` | `Promise<string>` | Returns the path the Chromium binary was extracted to. |
-| `setHeadlessMode` | `void` | Sets the headless mode to either `true` or `"new"` |
-| `headless` | `true \| "new"` | Returns `true` or `"new"` depending on what version of chrome's headless you are running |
-| `setGraphicsMode` | `void` | Sets the graphics mode to either `true` or `false` |
-| `graphics` | `boolean` | Returns a boolean depending on whether webgl is enabled or disabled |
+| `executablePath(location?: string)` | `Promise<string>` | Returns the path where the Chromium binary was extracted. |
+| `setGraphicsMode` | `void` | Sets the graphics mode to either `true` or `false`. |
+| `graphics` | `boolean` | Returns a boolean indicating whether WebGL is enabled or disabled. |
+
+## Extra Args documentation
+
+- [Comparisons](https://docs.google.com/spreadsheets/d/1n-vw_PCPS45jX3Jt9jQaAhFqBY6Ge1vWF_Pa0k7dCk4)
+- [Puppeteer Default Args](https://github.com/puppeteer/puppeteer/blob/729c160cba596a9b7b505abd4be99cba1af2e1f3/packages/puppeteer-core/src/node/ChromeLauncher.ts#L156)
+- [Playwright Default Args](https://github.com/microsoft/playwright/blob/ed23a935121687d246cb61f4146b50a7972864d9/packages/playwright-core/src/server/chromium/chromium.ts#L276)
+
+## Contributing
-## Compiling
+### Updating the binaries
-To compile your own version of Chromium check the [Ansible playbook instructions](_/ansible).
+> **Note:** For security reasons, we do not accept PRs that include updated binary files. Please submit the changes to build files only, and the maintainers will compile and update the binary files.
+
+1. Run `npm run update` to update [inventory.ini](_/ansible/inventory.ini) with the latest stable version of Chromium.
+2. Make any necessary changes to the [build-arch.yml](_/ansible/plays/build-arch.yml) file.
+3. Make any necessary changes to [inventory.ini](_/ansible/inventory.ini).
+4. Run the appropriate command from the [Makefile](_/ansible/Makefile). Use `make build` to compile both x64 and arm64 versions.
+5. If compiling both architectures and [al2023.tar.br](bin/x64/al2023.tar.br) has been modified, update the arm64 version by running `make build-arm-libs`.
+6. Verify that the `chromium-###.#.#.#.br` files are valid.
+7. Rename them to `chromium.br`.
+8. If necessary, update the Open Sans font using `npm run build:fonts`.
+9. Run tests on the new version of Chromium (`npm run test:source` and `npm run test:integration`). The integration tests require the AWS SAM CLI and Docker to be installed.
+
+### Updating Typescript application code
+
+1. Edit any of the source files in [source](source/).
+2. Create or update tests in [tests](tests/).
+3. Lint the package using `npm run lint`.
+4. Build the package using `npm run build`.
+5. Test the updates using `npm run test:source`.
+6. Run a full integration test using `npm run test:integration`. This requires the AWS SAM CLI and Docker to be installed.
## AWS Lambda Layer
-[Lambda Layers](https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html) is a convenient way to manage common dependencies between different Lambda Functions.
+[Lambda Layers](https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html) are a convenient way to manage common dependencies between different Lambda Functions.
The following set of (Linux) commands will create a layer of this package:
```shell
+archType="x64" && \
git clone --depth=1 https://github.com/sparticuz/chromium.git && \
cd chromium && \
-make chromium.zip
+make chromium.${archType}.zip
```
-The above will create a `chromium.zip` file, which can be uploaded to your Layers console. You can and should upload using the `aws cli`. (Replace the variables with your own values)
+The above will create a `chromium.x64.zip` file, which can be uploaded to your Layers console. If you are using `arm64`, replace the value accordingly. You can and should upload using the AWS CLI. (Replace the variables with your own values.)
```shell
-bucketName="chromiumUploadBucket" && \
-versionNumber="107" && \
-aws s3 cp chromium.zip "s3://${bucketName}/chromiumLayers/chromium${versionNumber}.zip" && \
-aws lambda publish-layer-version --layer-name chromium --description "Chromium v${versionNumber}" --content "S3Bucket=${bucketName},S3Key=chromiumLayers/chromium${versionNumber}.zip" --compatible-runtimes nodejs --compatible-architectures x86_64
+bucketName="chromiumUploadBucket" && archType="x64" && versionNumber="v135.0.0" && \
+aws s3 cp chromium.${archType}.zip "s3://${bucketName}/chromiumLayers/chromium-${versionNumber}-layer.${archType}.zip" && \
+aws lambda publish-layer-version --layer-name chromium --description "Chromium v${versionNumber} for ${archType}" --content "S3Bucket=${bucketName},S3Key=chromiumLayers/chromium-${versionNumber}-layer.${archType}.zip" --compatible-runtimes "nodejs20.x" "nodejs22.x" --compatible-architectures $(if [ "$archType" = "x64" ]; then echo "x86_64"; else echo "$archType"; fi)
```
Alternatively, you can also download the layer artifact from one of our [releases](https://github.com/Sparticuz/chromium/releases).
-According to our benchmarks, it's 40% to 50% faster than using the off-the-shelf `puppeteer` bundle.
-
-## Migration from `chrome-aws-lambda`
-
-- Change the import or require to be `@sparticuz/chromium`
-- Add the import or require for `puppeteer-core`
-- Change the browser launch to use the native `puppeteer.launch()` function
-- Change the `executablePath` to be a function.
-
-```diff
--const chromium = require('@sparticuz/chrome-aws-lambda');
-+const chromium = require("@sparticuz/chromium");
-+const puppeteer = require("puppeteer-core");
-
-exports.handler = async (event, context, callback) => {
- let result = null;
- let browser = null;
-
- try {
-- browser = await chromium.puppeteer.launch({
-+ browser = await puppeteer.launch({
- args: chromium.args,
- defaultViewport: chromium.defaultViewport,
-- executablePath: await chromium.executablePath,
-+ executablePath: await chromium.executablePath(),
- headless: chromium.headless,
- ignoreHTTPSErrors: true,
- });
-
- let page = await browser.newPage();
-
- await page.goto(event.url || 'https://example.com');
-
- result = await page.title();
- } catch (error) {
- return callback(error);
- } finally {
- if (browser !== null) {
- await browser.close();
- }
- }
-
- return callback(null, result);
-};
-```
-
## Compression
The Chromium binary is compressed using the Brotli algorithm.
-This allows us to get the best compression ratio and faster decompression times.
+This provides the best compression ratio while keeping decompression fast.
| File | Algorithm | Level | Bytes | MiB | % | Inflation |
| ------------- | --------- | ----- | --------- | --------- | ---------- | ---------- |
@@ -372,6 +454,18 @@ This allows us to get the best compression ratio and faster decompression times.
| `chromium.br` | Brotli | 10 | 36090087 | 34.42 | 73.65% | 0.765s |
| `chromium.br` | Brotli | 11 | 34820408 | **33.21** | **74.58%** | 0.712s |
+## Backers
+
+If you or your organization have benefited financially from this package, please consider supporting.
+
+Thank you to the following users and companies for your support!
+
+[](https://github.com/BeriMedia)
+[](https://github.com/Qvalia)
+[](https://github.com/munawwar)
+[](https://github.com/syntaxfm)
+[](https://github.com/th3madhack3r)
+
## License
MIT
diff --git _/amazon/events/example.com.json _/amazon/events/example.com.json
index c60d0e6..4751ba3 100644
--- _/amazon/events/example.com.json
+++ _/amazon/events/example.com.json
@@ -3,21 +3,14 @@
"url": "https://example.com",
"expected": {
"title": "Example Domain",
- "screenshot": "fdd55bf210cb00e00cadf3098055611d11293d02"
- }
- },
- {
- "url": "https://example.com",
- "expected": {
- "title": "Example Domain",
- "screenshot": "fdd55bf210cb00e00cadf3098055611d11293d02"
+ "screenshot": "5b4042aa3f20574b0b408e4c22d65255004d7d2ac1f69e96021649570c74bb36"
}
},
{
"url": "https://get.webgl.org",
"expected": {
"remove": "logo-container",
- "screenshot": "7a63a9a18f32dcdad78e1e0a03364fade25c85a8"
+ "screenshot": "1023e4f59fddb99d184847ca3711e79c06c04587aa7eacbf4ad6e97c7f52125d"
}
}
]
diff --git _/amazon/handlers/index.js _/amazon/handlers/index.js
deleted file mode 100644
index f41b3dc..0000000
--- _/amazon/handlers/index.js
+++ /dev/null
@@ -1,68 +0,0 @@
-const { ok } = require("assert");
-const { createHash } = require("crypto");
-const puppeteer = require("puppeteer-core");
-const chromium = require("@sparticuz/chromium");
-
-exports.handler = async (event, context) => {
- let browser = null;
-
- try {
- browser = await puppeteer.launch({
- args: chromium.args,
- defaultViewport: chromium.defaultViewport,
- executablePath: await chromium.executablePath(),
- headless: chromium.headless,
- ignoreHTTPSErrors: true,
- });
-
- console.log("Chromium verion", await browser.version());
-
- const contexts = [browser.defaultBrowserContext()];
-
- while (contexts.length < event.length) {
- contexts.push(await browser.createIncognitoBrowserContext());
- }
-
- for (let context of contexts) {
- const job = event.shift();
- const page = await context.newPage();
-
- if (job.hasOwnProperty("url") === true) {
- await page.goto(job.url, { waitUntil: ["domcontentloaded", "load"] });
-
- if (job.hasOwnProperty("expected") === true) {
- if (job.expected.hasOwnProperty("title") === true) {
- ok(
- (await page.title()) === job.expected.title,
- `Title assertion failed.`
- );
- }
-
- if (job.expected.hasOwnProperty("screenshot") === true) {
- if (job.expected.hasOwnProperty("remove") === true) {
- await page.evaluate((selector) => {
- document.getElementById(selector).remove();
- }, job.expected.remove);
- }
- const screenshot = await page.screenshot();
- // console.log(screenshot.toString('base64'), createHash('sha1').update(screenshot.toString('base64')).digest('hex'));
- ok(
- createHash("sha1")
- .update(screenshot.toString("base64"))
- .digest("hex") === job.expected.screenshot,
- `Screenshot assertion failed.`
- );
- }
- }
- }
- }
- } catch (error) {
- throw error.message;
- } finally {
- if (browser !== null) {
- await browser.close();
- }
- }
-
- return true;
-};
diff --git a/_/amazon/handlers/index.mjs b/_/amazon/handlers/index.mjs
new file mode 100644
index 0000000..b5e7da0
--- /dev/null
+++ _/amazon/handlers/index.mjs
@@ -0,0 +1,90 @@
+/* eslint-disable sonarjs/no-commented-code */
+import chromium from "@sparticuz/chromium";
+import { ok } from "node:assert";
+import { createHash } from "node:crypto";
+import puppeteer from "puppeteer-core";
+
+export const handler = async (
+ /** @type {{url: string; expected: {title: string; remove: string; screenshot: string}}[]} */ event
+ // eslint-disable-next-line sonarjs/cognitive-complexity
+) => {
+ let browser = null;
+
+ try {
+ browser = await puppeteer.launch({
+ args: puppeteer.defaultArgs({
+ // Add in more args for serverless environments
+ args: chromium.args,
+ headless: "shell",
+ }),
+ defaultViewport: {
+ deviceScaleFactor: 1,
+ hasTouch: false,
+ height: 1080,
+ isLandscape: true,
+ isMobile: false,
+ width: 1920,
+ },
+ executablePath: await chromium.executablePath(),
+ headless: "shell",
+ });
+
+ console.log("Chromium version", await browser.version());
+
+ for (let job of event) {
+ const page = await browser.newPage();
+
+ if (Object.prototype.hasOwnProperty.call(job, "url") === true) {
+ await page.goto(job.url, { waitUntil: ["domcontentloaded", "load"] });
+
+ if (Object.prototype.hasOwnProperty.call(job, "expected") === true) {
+ if (
+ Object.prototype.hasOwnProperty.call(job.expected, "title") === true
+ ) {
+ ok(
+ (await page.title()) === job.expected.title,
+ `Title assertion failed.`
+ );
+ }
+
+ if (
+ Object.prototype.hasOwnProperty.call(job.expected, "screenshot") ===
+ true
+ ) {
+ if (
+ Object.prototype.hasOwnProperty.call(job.expected, "remove") ===
+ true
+ ) {
+ await page.evaluate((selector) => {
+ // eslint-disable-next-line unicorn/prefer-query-selector
+ document.getElementById(selector)?.remove();
+ }, job.expected.remove);
+ }
+ const screenshot = Buffer.from(await page.screenshot());
+ const base64 = `data:image/png;base64,${screenshot.toString(
+ "base64"
+ )}`;
+ const hash = createHash("sha256").update(base64).digest("hex");
+ // console.log(base64, hash);
+ ok(
+ hash === job.expected.screenshot,
+ `Screenshot assertion failed.`
+ );
+ }
+ }
+ }
+ }
+ } catch (error) {
+ // @ts-expect-error It's an error
+ throw error.message;
+ } finally {
+ if (browser !== null) {
+ for (const page of await browser.pages()) {
+ await page.close();
+ }
+ await browser.close();
+ }
+ }
+
+ return true;
+};
diff --git _/amazon/template.yml _/amazon/template.yml
index 046f903..b43b38d 100644
--- _/amazon/template.yml
+++ _/amazon/template.yml
@@ -12,27 +12,45 @@ Resources:
LayerName: sparticuz-chromium
ContentUri: code/
CompatibleRuntimes:
- - nodejs16.x
- - nodejs18.x
+ - nodejs20.x
+ - nodejs22.x
+ - nodejs24.x
- node16:
+ node20:
Type: AWS::Serverless::Function
Properties:
+ Architectures:
+ - x86_64
Layers:
- !Ref layer
Handler: handlers/index.handler
- Runtime: nodejs16.x
+ Runtime: nodejs20.x
Policies:
- AWSLambdaBasicExecutionRole
- AWSXRayDaemonWriteAccess
Tracing: Active
- node18:
+ node22:
Type: AWS::Serverless::Function
Properties:
+ Architectures:
+ - x86_64
Layers:
- !Ref layer
Handler: handlers/index.handler
- Runtime: nodejs18.x
+ Runtime: nodejs22.x
+ Policies:
+ - AWSLambdaBasicExecutionRole
+ - AWSXRayDaemonWriteAccess
+ Tracing: Active
+ node24:
+ Type: AWS::Serverless::Function
+ Properties:
+ Architectures:
+ - x86_64
+ Layers:
+ - !Ref layer
+ Handler: handlers/index.handler
+ Runtime: nodejs24.x
Policies:
- AWSLambdaBasicExecutionRole
- AWSXRayDaemonWriteAccess
diff --git _/ansible/Makefile _/ansible/Makefile
index 323006a..55ac36e 100644
--- _/ansible/Makefile
+++ _/ansible/Makefile
@@ -1,9 +1,18 @@
.PHONY: ansible chromium
dependencies:
- sudo apt install python3-pip zip
- pip install ansible boto boto3 aws-sam-cli
- echo "Docker is also required in order to test the package, please install docker or Docker Desktop"
+ sudo pacman -S ansible docker python-boto3 zip
+ echo "aws-sam-cli-bin is also required in order to test the package"
-chromium:
- ansible-playbook plays/chromium.yml -i inventory.ini
+build:
+ echo "Building both architectures takes roughly 5 hours"
+ /usr/bin/ansible-playbook plays/chromium.yml -i inventory.ini
+
+build-x64:
+ /usr/bin/ansible-playbook plays/chromium.yml -i inventory.ini --extra-vars 'archs=["x64"]'
+
+build-arm64:
+ /usr/bin/ansible-playbook plays/chromium.yml -i inventory.ini --extra-vars 'archs=["arm64"]'
+
+build-arm-libs:
+ /usr/bin/ansible-playbook plays/arm-libs.yml -i inventory.ini
diff --git a/_/ansible/ansible.cfg b/_/ansible/ansible.cfg
old mode 100755
new mode 100644
diff --git a/_/ansible/inventory.ini b/_/ansible/inventory.ini
diff --git _/ansible/inventory.ini _/ansible/inventory.ini
index 6fded90..c59fb8c 100644
--- _/ansible/inventory.ini
+++ _/ansible/inventory.ini
@@ -4,9 +4,11 @@
[localhost:vars]
ansible_connection=local
ansible_python_interpreter=python
-image=ami-08d090f841c8435e9
-region=us-east-1
-instance_size=c7i.12xlarge
+aws_region=us-east-1
+# The instance type for x64 must include NVME attached storage
+x64_instance=i4i.8xlarge
+# The arm64 instance type is only to download the arm64 lib files
+arm64_instance=m8g.medium
[aws]
@@ -14,4 +16,5 @@ instance_size=c7i.12xlarge
ansible_connection=ssh
ansible_python_interpreter=auto_silent
ansible_ssh_private_key_file=ansible.pem
-chromium_revision=1181205
+chromium_revision=1536371
+archs=["x64", "arm64"]
diff --git a/_/ansible/plays/arm-libs.yml b/_/ansible/plays/arm-libs.yml
new file mode 100644
index 0000000..3c8869c
--- /dev/null
+++ _/ansible/plays/arm-libs.yml
@@ -0,0 +1,213 @@
+---
+- name: Bootstrap AWS
+ hosts: localhost
+ gather_facts: false
+
+ tasks:
+ - name: Creating SSH Key
+ shell: |
+ ssh-keygen -b 2048 -t rsa -f ansible.pem -q -N '' && \
+ chmod 0600 ansible.pem.pub
+ args:
+ chdir: ..
+ creates: ansible.pem
+
+ - name: Creating EC2 Key Pair
+ amazon.aws.ec2_key:
+ name: ansible
+ state: present
+ region: "{{ aws_region }}"
+ key_material: "{{ item }}"
+ with_file: ../ansible.pem.pub
+
+ - name: Creating Security Group
+ amazon.aws.ec2_group:
+ name: Chromium
+ description: SSH Access
+ state: present
+ region: "{{ aws_region }}"
+ rules:
+ - proto: tcp
+ to_port: 22
+ from_port: 22
+ cidr_ip: 0.0.0.0/0
+ rules_egress:
+ - proto: all
+ cidr_ip: 0.0.0.0/0
+
+ - name: Request EC2 Instance
+ amazon.aws.ec2_instance:
+ count: 1
+ ebs_optimized: yes
+ image:
+ id: "{{ lookup('amazon.aws.ssm_parameter', '/aws/service/ami-amazon-linux-latest/al2023-ami-kernel-6.1-arm64') }}"
+ instance_initiated_shutdown_behavior: terminate
+ instance_type: "{{ arm64_instance }}"
+ key_name: ansible
+ network_interfaces:
+ - assign_public_ip: yes
+ groups: Chromium
+ region: "{{ aws_region }}"
+ security_group: Chromium
+ state: present
+ tags:
+ Name: Chromium
+ volumes:
+ - device_name: /dev/xvda
+ ebs:
+ delete_on_termination: true
+ volume_type: io2
+ volume_size: 256
+ iops: 3000
+ register: ec2
+
+ - name: Registering Host
+ add_host:
+ hostname: "{{ ec2.instances[0].public_ip_address }}"
+ groupname: aws
+
+ - name: Waiting for SSH
+ wait_for:
+ host: "{{ ec2.instances[0].public_ip_address }}"
+ port: 22
+ timeout: 320
+ state: started
+
+- name: AWS
+ user: ec2-user
+ hosts: aws
+ gather_facts: true
+ environment:
+ LANG: en_US.UTF-8
+ LC_ALL: en_US.UTF-8
+ PATH: "{{ ansible_env.PATH }}"
+
+ tasks:
+ - name: Update system
+ become: true
+ become_user: root
+ shell: |
+ dnf update -y
+
+ - name: Installing Packages
+ become: true
+ become_user: root
+ dnf:
+ name:
+ - "@Development Tools"
+ - alsa-lib-devel
+ - atk-devel
+ - bc
+ - bluez-libs-devel
+ - bzip2-devel
+ - cairo-devel
+ - cmake
+ - cups-devel
+ - dbus-devel
+ - dbus-glib-devel
+ - dbus-x11
+ - expat-devel
+ - glibc
+ - glibc-langpack-en
+ - gperf
+ - gtk3-devel
+ - httpd
+ - java-17-amazon-corretto
+ - libatomic
+ - libcap-devel
+ - libjpeg-devel
+ - libstdc++
+ - libXScrnSaver-devel
+ - libxkbcommon-x11-devel
+ - mod_ssl
+ - ncurses-compat-libs
+ - nspr-devel
+ - nss-devel
+ - pam-devel
+ - pciutils-devel
+ - perl
+ - php
+ - php-cli
+ - pulseaudio-libs-devel
+ - python
+ - python-psutil
+ - python-setuptools
+ - ruby
+ - xorg-x11-server-Xvfb
+ - zlib
+ state: latest
+ update_cache: true
+
+ - name: Checking for Directory Structure
+ stat:
+ path: /srv/source/chromium
+ register: structure
+
+ - name: Creating Directory Structure
+ become: true
+ become_user: root
+ file:
+ path: /srv/{{ item }}/chromium
+ state: directory
+ group: ec2-user
+ owner: ec2-user
+ recurse: true
+ with_items:
+ - lib
+ when: structure.stat.exists != true
+
+ - name: Creating AL2023 Package
+ shell: |
+ tar --directory /usr/lib64 --create --file al2023.tar \
+ --transform='s,^libexpat\.so\.1\.9\.3$,libexpat.so.1,' \
+ --transform='s,^,lib/,' \
+ libexpat.so.1.9.3 libfreebl3.so libfreeblpriv3.so libnspr4.so libnss3.so libnssutil3.so libplc4.so libplds4.so libsoftokn3.so libfreebl3.chk libfreeblpriv3.chk libsoftokn3.chk
+ args:
+ chdir: /srv/lib
+ creates: /srv/lib/al2023.tar
+
+ - name: Compressing AL2023 Package
+ shell: |
+ brotli --best --force al2023.tar
+ args:
+ chdir: /srv/lib
+ creates: /srv/lib/al2023.tar.br
+
+ - name: Downloading AL2023 Package
+ fetch:
+ src: /srv/lib/al2023.tar.br
+ dest: ../../../bin/arm64/
+ flat: yes
+ fail_on_missing: true
+
+- name: Teardown AWS
+ hosts: localhost
+ gather_facts: false
+
+ tasks:
+ - name: Terminating EC2 Instance
+ amazon.aws.ec2_instance:
+ wait: yes
+ state: absent
+ instance_ids: "{{ ec2.instance_ids }}"
+ region: "{{ aws_region }}"
+
+ - name: Deleting Security Group
+ amazon.aws.ec2_group:
+ name: Chromium
+ state: absent
+ region: "{{ aws_region }}"
+
+ - name: Deleting EC2 Key Pair
+ amazon.aws.ec2_key:
+ name: ansible
+ state: absent
+ region: "{{ aws_region }}"
+
+ - name: Deleting SSH Key
+ file:
+ path: "../{{ item }}"
+ state: absent
+ with_items:
+ - ansible.pem
+ - ansible.pem.pub
diff --git a/_/ansible/plays/build-arm64.yml b/_/ansible/plays/build-arm64.yml
new file mode 100644
index 0000000..72f332a
--- /dev/null
+++ _/ansible/plays/build-arm64.yml
@@ -0,0 +1,95 @@
+- name: Creating Headless Chromium Configuration
+ copy:
+ content: |
+ import("//build/args/headless.gn")
+ blink_symbol_level = 0
+ dcheck_always_on = false
+ disable_histogram_support = false
+ enable_basic_print_dialog = false
+ enable_keystone_registration_framework = false
+ enable_linux_installer = false
+ enable_media_remoting = false
+ ffmpeg_branding = "Chrome"
+ is_component_build = false
+ is_debug = false
+ is_official_build = true
+ proprietary_codecs = true
+ symbol_level = 0
+ target_os = "linux"
+ use_sysroot = true
+ v8_symbol_level = 0
+ target_cpu="arm64"
+ v8_target_cpu="arm64"
+ dest: /srv/source/chromium/src/out/Headless/arm64/args.gn
+
+- name: Building ARM64 Sysroot
+ shell: |
+ ./build/linux/sysroot_scripts/install-sysroot.py --arch=arm64
+ args:
+ chdir: /srv/source/chromium/src
+
+- name: Generating Headless Chromium Configuration
+ shell: |
+ gn gen out/Headless/arm64
+ args:
+ chdir: /srv/source/chromium/src
+
+- name: Compiling Headless Chromium
+ shell: |
+ autoninja -C out/Headless/arm64 headless_shell
+ args:
+ chdir: /srv/source/chromium/src
+
+- name: Getting Chromium Version
+ shell: |
+ sed --regexp-extended 's~[^0-9]+~~g' chrome/VERSION | tr '\n' '.' | sed 's~[.]$~~'
+ args:
+ chdir: /srv/source/chromium/src
+ register: version
+
+# TODO, switch to binutils
+- name: Stripping Symbols from Chromium (arm64) Binary
+ shell: |
+ wget https://releases.linaro.org/components/toolchain/binaries/latest-7/aarch64-linux-gnu/gcc-linaro-7.5.0-2019.12-x86_64_aarch64-linux-gnu.tar.xz
+ tar -xf gcc-linaro-7.5.0-2019.12-x86_64_aarch64-linux-gnu.tar.xz
+ ./gcc-linaro-7.5.0-2019.12-x86_64_aarch64-linux-gnu/bin/aarch64-linux-gnu-strip -o /srv/build/chromium/chromium-{{ version.stdout | quote }} out/Headless/arm64/headless_shell
+ args:
+ chdir: /srv/source/chromium/src
+
+- name: Compressing Chromium
+ shell: |
+ brotli --best --force {{ item }}
+ args:
+ chdir: /srv/build/chromium
+ with_items:
+ - "chromium-{{ version.stdout }}"
+
+- name: Downloading Chromium
+ fetch:
+ src: "/srv/build/chromium/{{ item }}"
+ dest: ../../../bin/arm64/
+ flat: yes
+ fail_on_missing: true
+ with_items:
+ - "chromium-{{ version.stdout }}.br"
+
+- name: Archiving OpenGL ES driver
+ shell: |
+ tar --directory /srv/source/chromium/src/out/Headless/arm64 --create --file swiftshader.tar libEGL.so libGLESv2.so libvk_swiftshader.so libvulkan.so.1 vk_swiftshader_icd.json
+ args:
+ chdir: /srv/build/chromium
+ creates: /srv/build/chromium/swiftshader.tar
+
+- name: Compressing OpenGL ES driver
+ shell: |
+ brotli --best --force swiftshader.tar
+ args:
+ chdir: /srv/build/chromium
+ creates: /srv/build/chromium/swiftshader.tar.br
+
+- name: Downloading OpenGL ES driver
+ fetch:
+ src: /srv/build/chromium/swiftshader.tar.br
+ dest: ../../../bin/arm64/
+ flat: yes
+ fail_on_missing: true
diff --git a/_/ansible/plays/build-x64.yml b/_/ansible/plays/build-x64.yml
new file mode 100644
index 0000000..ef3d8ae
--- /dev/null
+++ _/ansible/plays/build-x64.yml
@@ -0,0 +1,110 @@
+- name: Creating Headless Chromium Configuration
+ copy:
+ content: |
+ import("//build/args/headless.gn")
+ blink_symbol_level = 0
+ dcheck_always_on = false
+ disable_histogram_support = false
+ enable_basic_print_dialog = false
+ enable_keystone_registration_framework = false
+ enable_linux_installer = false
+ enable_media_remoting = false
+ ffmpeg_branding = "Chrome"
+ is_component_build = false
+ is_debug = false
+ is_official_build = true
+ proprietary_codecs = true
+ symbol_level = 0
+ target_os = "linux"
+ use_sysroot = true
+ v8_symbol_level = 0
+ target_cpu="x64"
+ v8_target_cpu="x64"
+ dest: /srv/source/chromium/src/out/Headless/x64/args.gn
+
+- name: Generating Headless Chromium Configuration
+ shell: |
+ gn gen out/Headless/x64
+ args:
+ chdir: /srv/source/chromium/src
+
+- name: Compiling Headless Chromium
+ shell: |
+ autoninja -C out/Headless/x64 headless_shell
+ args:
+ chdir: /srv/source/chromium/src
+
+- name: Getting Chromium Version
+ shell: |
+ sed --regexp-extended 's~[^0-9]+~~g' chrome/VERSION | tr '\n' '.' | sed 's~[.]$~~'
+ args:
+ chdir: /srv/source/chromium/src
+ register: version
+
+- name: Stripping Symbols from Chromium (x64) Binary
+ shell: |
+ strip -o /srv/build/chromium/chromium-{{ version.stdout | quote }} out/Headless/x64/headless_shell
+ args:
+ chdir: /srv/source/chromium/src
+
+- name: Compressing Chromium
+ shell: |
+ brotli --best --force {{ item }}
+ args:
+ chdir: /srv/build/chromium
+ with_items:
+ - "chromium-{{ version.stdout }}"
+
+- name: Downloading Chromium
+ fetch:
+ src: "/srv/build/chromium/{{ item }}"
+ dest: ../../../bin/x64/
+ flat: yes
+ fail_on_missing: true
+ with_items:
+ - "chromium-{{ version.stdout }}.br"
+
+- name: Archiving OpenGL ES driver
+ shell: |
+ tar --directory /srv/source/chromium/src/out/Headless/x64 --create --file swiftshader.tar libEGL.so libGLESv2.so libvk_swiftshader.so libvulkan.so.1 vk_swiftshader_icd.json
+ args:
+ chdir: /srv/build/chromium
+ creates: /srv/build/chromium/swiftshader.tar
+
+- name: Compressing OpenGL ES driver
+ shell: |
+ brotli --best --force swiftshader.tar
+ args:
+ chdir: /srv/build/chromium
+ creates: /srv/build/chromium/swiftshader.tar.br
+
+- name: Downloading OpenGL ES driver
+ fetch:
+ src: /srv/build/chromium/swiftshader.tar.br
+ dest: ../../../bin/x64/
+ flat: yes
+ fail_on_missing: true
+
+- name: Creating AL2023 Package
+ shell: |
+ tar --directory /usr/lib64 --create --file al2023.tar \
+ --transform='s,^libexpat\.so\.1\.9\.3$,libexpat.so.1,' \
+ --transform='s,^,lib/,' \
+ libexpat.so.1.9.3 libfreebl3.so libfreeblpriv3.so libnspr4.so libnss3.so libnssutil3.so libplc4.so libplds4.so libsoftokn3.so libfreebl3.chk libfreeblpriv3.chk libsoftokn3.chk
+ args:
+ chdir: /srv/lib
+ creates: /srv/lib/al2023.tar
+
+- name: Compressing AL2023 Package
+ shell: |
+ brotli --best --force al2023.tar
+ args:
+ chdir: /srv/lib
+ creates: /srv/lib/al2023.tar.br
+
+- name: Downloading AL2023 Package
+ fetch:
+ src: /srv/lib/al2023.tar.br
+ dest: ../../../bin/x64/
+ flat: yes
+ fail_on_missing: true
diff --git _/ansible/plays/chromium.yml _/ansible/plays/chromium.yml
index 6aafe1e..35af686 100644
--- _/ansible/plays/chromium.yml
+++ _/ansible/plays/chromium.yml
@@ -16,7 +16,7 @@
amazon.aws.ec2_key:
name: ansible
state: present
- region: "{{ region }}"
+ region: "{{ aws_region }}"
key_material: "{{ item }}"
with_file: ../ansible.pem.pub
@@ -25,7 +25,7 @@
name: Chromium
description: SSH Access
state: present
- region: "{{ region }}"
+ region: "{{ aws_region }}"
rules:
- proto: tcp
to_port: 22
@@ -40,26 +40,18 @@
count: 1
ebs_optimized: yes
image:
- id: "{{ image }}"
+ id: "{{ lookup('amazon.aws.ssm_parameter', '/aws/service/ami-amazon-linux-latest/al2023-ami-kernel-6.1-x86_64') }}"
instance_initiated_shutdown_behavior: terminate
- instance_type: "{{ instance_size }}"
+ instance_type: "{{ x64_instance }}"
key_name: ansible
- network:
- assign_public_ip: yes
- delete_on_termination: yes
- groups: Chromium
- region: "{{ region }}"
+ network_interfaces:
+ - assign_public_ip: yes
+ groups: Chromium
+ region: "{{ aws_region }}"
security_group: Chromium
state: present
tags:
Name: Chromium
- volumes:
- - device_name: /dev/xvda
- ebs:
- delete_on_termination: true
- volume_type: io2
- volume_size: 128
- iops: 3000
register: ec2
- name: Registering Host
@@ -84,11 +76,19 @@
PATH: "{{ ansible_env.PATH }}:/srv/source/depot_tools"
tasks:
+ - name: Mount NVME drive
+ become: true
+ become_user: root
+ shell: |
+ mkfs -t ext4 -m 0 /dev/nvme1n1
+ echo "/dev/nvme1n1 /srv ext4 defaults,noatime,nofail 0 2" >> /etc/fstab
+ mount -a
+
- name: Update system
become: true
become_user: root
shell: |
- dnf update --releasever=2022.0.20221207 -y
+ dnf update -y
- name: Installing Packages
become: true
@@ -156,6 +156,7 @@
with_items:
- build
- source
+ - lib
when: structure.stat.exists != true
- name: Cloning Depot Tools
@@ -186,12 +187,11 @@
- name: Parse Result
set_fact:
- gitsha: >
- {{ revision.content | regex_search('"git_sha":"([a-zA-Z0-9_]*)"', '\1') | trim }}
+ gitsha: "{{ revision.content | regex_search('\"git_sha\":\"([a-f0-9]+)\"', '\\1') | first }}"
- name: Checking Out Chromium revision
shell: |
- gclient sync --delete_unversioned_trees --revision {{ gitsha | first }} --with_branch_heads
+ gclient sync --force --reset --delete_unversioned_trees --revision {{ gitsha }} --with_branch_heads
args:
chdir: /srv/source/chromium
@@ -233,102 +233,19 @@
- name: Creating Build Configuration Directory
file:
mode: 0755
- path: /srv/source/chromium/src/out/Headless
+ path: /srv/source/chromium/src/out/Headless/{{ item }}
state: directory
-
- - name: Mounting Build Directory in Memory
- become: true
- become_user: root
- shell: |
- mount --types tmpfs --options size=24G,nr_inodes=128k,mode=1777 tmpfs /srv/source/chromium/src/out/Headless
-
- - name: Creating Headless Chromium Configuration
- copy:
- content: |
- import("//build/args/headless.gn")
- blink_symbol_level = 0
- dcheck_always_on = false
- disable_histogram_support = false
- enable_basic_print_dialog = false
- enable_basic_printing = true
- enable_keystone_registration_framework = false
- enable_linux_installer = false
- enable_media_remoting = false
- ffmpeg_branding = "Chrome"
- is_component_build = false
- is_debug = false
- is_official_build = true
- proprietary_codecs = true
- symbol_level = 0
- target_cpu = "x64"
- target_os = "linux"
- use_sysroot = true
- v8_symbol_level = 0
- v8_target_cpu = "x64"
- dest: /srv/source/chromium/src/out/Headless/args.gn
-
- - name: Generating Headless Chromium Configuration
- shell: |
- gn gen out/Headless
- args:
- chdir: /srv/source/chromium/src
-
- - name: Compiling Headless Chromium
- shell: |
- autoninja -C out/Headless headless_shell
- args:
- chdir: /srv/source/chromium/src
-
- - name: Getting Chromium Version
- shell: |
- sed --regexp-extended 's~[^0-9]+~~g' chrome/VERSION | tr '\n' '.' | sed 's~[.]$~~'
- args:
- chdir: /srv/source/chromium/src
- register: version
-
- - name: Striping Symbols from Chromium Binary
- shell: |
- strip -o /srv/build/chromium/chromium-{{ version.stdout | quote }} out/Headless/headless_shell
- args:
- chdir: /srv/source/chromium/src
-
- - name: Compressing Chromium
- shell: |
- brotli --best --force {{ item }}
- args:
- chdir: /srv/build/chromium
- with_items:
- - "chromium-{{ version.stdout }}"
-
- - name: Downloading Chromium
- fetch:
- src: "/srv/build/chromium/{{ item }}"
- dest: ../../../bin/
- flat: yes
- fail_on_missing: true
with_items:
- - "chromium-{{ version.stdout }}.br"
-
- - name: Archiving OpenGL ES driver
- shell: |
- tar --directory /srv/source/chromium/src/out/Headless --create --file swiftshader.tar libEGL.so libGLESv2.so libvk_swiftshader.so libvulkan.so.1 vk_swiftshader_icd.json
- args:
- chdir: /srv/build/chromium
- creates: /srv/build/chromium/swiftshader.tar
+ - x64
+ - arm64
- - name: Compressing OpenGL ES driver
- shell: |
- brotli --best --force swiftshader.tar
- args:
- chdir: /srv/build/chromium
- creates: /srv/build/chromium/swiftshader.tar.br
+ - name: Compile and package Chromium for x64
+ include_tasks: build-x64.yml
+ when: "'x64' in archs"
- - name: Downloading OpenGL ES driver
- fetch:
- src: /srv/build/chromium/swiftshader.tar.br
- dest: ../../../bin/
- flat: yes
- fail_on_missing: true
+ - name: Compile and package Chromium for arm64
+ include_tasks: build-arm64.yml
+ when: "'arm64' in archs"
- name: Teardown AWS
hosts: localhost
@@ -340,19 +257,19 @@
wait: yes
state: absent
instance_ids: "{{ ec2.instance_ids }}"
- region: "{{ region }}"
+ region: "{{ aws_region }}"
- name: Deleting Security Group
amazon.aws.ec2_group:
name: Chromium
state: absent
- region: "{{ region }}"
+ region: "{{ aws_region }}"
- name: Deleting EC2 Key Pair
amazon.aws.ec2_key:
name: ansible
state: absent
- region: "{{ region }}"
+ region: "{{ aws_region }}"
- name: Deleting SSH Key
file:
diff --git a/bin/arm64/al2023.tar.br b/bin/arm64/al2023.tar.br
new file mode 100644
index 0000000..c798db5
Binary files /dev/null and b/bin/arm64/al2023.tar.br differ
diff --git a/bin/chromium.br b/bin/arm64/chromium.br
similarity index 79%
rename from bin/chromium.br
rename to bin/arm64/chromium.br
index a86c837..aba6556 100755
Binary files a/bin/chromium.br and b/bin/arm64/chromium.br differ
diff --git a/bin/arm64/swiftshader.tar.br b/bin/arm64/swiftshader.tar.br
new file mode 100644
index 0000000..0f9e0d3
Binary files /dev/null and b/bin/arm64/swiftshader.tar.br differ
diff --git a/bin/aws.tar.br b/bin/aws.tar.br
deleted file mode 100644
index ab8e70e..0000000
Binary files a/bin/aws.tar.br and /dev/null differ
diff --git a/bin/fonts.tar.br b/bin/fonts.tar.br
new file mode 100644
index 0000000..dc57612
Binary files /dev/null and b/bin/fonts.tar.br differ
diff --git a/bin/swiftshader.tar.br b/bin/swiftshader.tar.br
deleted file mode 100644
index 576b86b..0000000
Binary files a/bin/swiftshader.tar.br and /dev/null differ
diff --git a/bin/x64/al2023.tar.br b/bin/x64/al2023.tar.br
new file mode 100644
index 0000000..903c61b
Binary files /dev/null and b/bin/x64/al2023.tar.br differ
diff --git a/bin/x64/chromium.br b/bin/x64/chromium.br
new file mode 100755
index 0000000..80ebdb4
Binary files /dev/null and b/bin/x64/chromium.br differ
diff --git a/bin/x64/swiftshader.tar.br b/bin/x64/swiftshader.tar.br
new file mode 100644
index 0000000..5ac03e5
Binary files /dev/null and b/bin/x64/swiftshader.tar.br differ
diff --git a/eslint.config.js b/eslint.config.js
new file mode 100644
index 0000000..1e798a2
--- /dev/null
+++ eslint.config.js
@@ -0,0 +1,31 @@
+// @ts-expect-error I have no types
+import myConfig from "@sparticuz/eslint-config";
+import tseslint from "typescript-eslint";
+
+export default tseslint.config(
+ {
+ name: "Ignores",
+
+ ignores: [
+ "node_modules",
+ "examples",
+ "build",
+ "coverage",
+ "vitest.config.ts",
+ ],
+ },
+ ...myConfig,
+ {
+ languageOptions: {
+ parserOptions: {
+ // I have engines already set to >=20.11
+ // eslint-disable-next-line n/no-unsupported-features/node-builtins
+ tsconfigRootDir: import.meta.dirname,
+ },
+ },
+ rules: {
+ "security/detect-non-literal-fs-filename": "off",
+ "unicorn/prevent-abbreviations": "off",
+ },
+ }
+);
diff --git a/examples/aws-sam/.gitignore b/examples/aws-sam/.gitignore
new file mode 100644
index 0000000..0a03531
--- /dev/null
+++ examples/aws-sam/.gitignore
@@ -0,0 +1 @@
+.aws-sam
diff --git a/examples/aws-sam/README.md b/examples/aws-sam/README.md
new file mode 100644
index 0000000..a004b5f
--- /dev/null
+++ examples/aws-sam/README.md
@@ -0,0 +1,19 @@
+# Chromium as a Layer for AWS SAM
+
+1. Install AWS SAM CLI: https://github.com/aws/aws-sam-cli/
+
+1. Ensure Docker is installed and running: https://www.docker.com/
+
+1. Build the project:
+
+ ```sh
+ sam build
+ ```
+
+1. Invoke the AWS Lambda Function locally with:
+
+ ```sh
+ sam local invoke ExampleFunction
+ ```
+
+ This example connects to https://www.example.com and outputs the page's title as the function result. See the source code in [`app.mjs`](functions/exampleFunction/app.mjs) for more details.
diff --git a/examples/aws-sam/functions/exampleFunction/app.mjs b/examples/aws-sam/functions/exampleFunction/app.mjs
new file mode 100644
index 0000000..de94ba7
--- /dev/null
+++ examples/aws-sam/functions/exampleFunction/app.mjs
@@ -0,0 +1,24 @@
+import chromium from '@sparticuz/chromium';
+import puppeteer from 'puppeteer-core';
+
+export const lambdaHandler = async (event, context) => {
+ const browser = await puppeteer.launch({
+ args: chromium.args,
+ defaultViewport: chromium.defaultViewport,
+ executablePath: await chromium.executablePath(),
+ headless: chromium.headless,
+ });
+
+ const page = await browser.newPage();
+
+ await page.goto("https://www.example.com", { waitUntil: "networkidle0" });
+
+ const browserVersion = await browser.version();
+ const pageTitle = await page.title();
+
+ await page.close();
+
+ await browser.close();
+
+ return { result: 'success', browserVersion, pageTitle };
+}
diff --git a/examples/aws-sam/functions/exampleFunction/package.json b/examples/aws-sam/functions/exampleFunction/package.json
new file mode 100644
index 0000000..bcb1bc6
--- /dev/null
+++ examples/aws-sam/functions/exampleFunction/package.json
@@ -0,0 +1,13 @@
+{
+ "name": "ExampleFunction",
+ "private": true,
+ "version": "0.1.0",
+ "description": "AWS Lambda Function that loads Chromium. Refer to https://github.com/Sparticuz/chromium#install for compatible versions.",
+ "main": "app.mjs",
+ "devDependencies": {
+ "@sparticuz/chromium": "^133.0.0"
+ },
+ "dependencies": {
+ "puppeteer-core": "^24.9.0"
+ }
+}
diff --git a/examples/aws-sam/layers/chromium/package.json b/examples/aws-sam/layers/chromium/package.json
new file mode 100644
index 0000000..1d67003
--- /dev/null
+++ examples/aws-sam/layers/chromium/package.json
@@ -0,0 +1,9 @@
+{
+ "name": "ChromiumLayer",
+ "private": true,
+ "version": "1.0.0",
+ "description": "Chromium layer for AWS Lambda",
+ "dependencies": {
+ "@sparticuz/chromium": "^133.0.0"
+ }
+}
diff --git a/examples/aws-sam/template.yml b/examples/aws-sam/template.yml
new file mode 100644
index 0000000..54332f0
--- /dev/null
+++ examples/aws-sam/template.yml
@@ -0,0 +1,33 @@
+AWSTemplateFormatVersion: "2010-09-09"
+Transform: AWS::Serverless-2016-10-31
+Description: Example configuration for AWS SAM and Chromium
+
+Resources:
+ ChromiumLayer:
+ Type: AWS::Serverless::LayerVersion
+ Properties:
+ Description: Chromium with Node.js integration for AWS Lambda
+ ContentUri: layers/chromium
+ CompatibleRuntimes:
+ - &nodejsRuntime nodejs22.x
+ # Chromium doesn't currently have ARM support; see https://github.com/Sparticuz/chromium#can-i-use-arm-or-graviton-instances
+ CompatibleArchitectures:
+ - &chromiumArch x86_64
+ RetentionPolicy: Delete
+ Metadata:
+ BuildMethod: *nodejsRuntime
+ BuildArchitecture: *chromiumArch
+
+ ExampleFunction:
+ Type: AWS::Serverless::Function
+ Properties:
+ CodeUri: functions/exampleFunction
+ Handler: app.lambdaHandler
+ Runtime: *nodejsRuntime
+ Architectures:
+ - *chromiumArch
+ Layers:
+ - !Ref ChromiumLayer
+ # Adjust as necessary
+ Timeout: 30
+ MemorySize: 1024
diff --git a/examples/serverless-graviton-with-pack/index.js b/examples/serverless-graviton-with-pack/index.js
new file mode 100644
index 0000000..c5aa938
--- /dev/null
+++ examples/serverless-graviton-with-pack/index.js
@@ -0,0 +1,31 @@
+const puppeteer = require("puppeteer-core");
+const chromium = require("@sparticuz/chromium-min");
+
+module.exports = {
+ handler: async () => {
+ try {
+ const browser = await puppeteer.launch({
+ args: chromium.args,
+ defaultViewport: chromium.defaultViewport,
+ executablePath: await chromium.executablePath(
+ "https://github.com/Sparticuz/chromium/releases/download/v135.0.0-next.3/chromium-v135.0.0-next.3-pack.arm64.tar"
+ ),
+ headless: chromium.headless,
+ ignoreHTTPSErrors: true,
+ });
+
+ const page = await browser.newPage();
+
+ await page.goto("https://www.example.com", { waitUntil: "networkidle0" });
+
+ console.log("Chromium:", await browser.version());
+ console.log("Page Title:", await page.title());
+
+ await page.close();
+
+ await browser.close();
+ } catch (error) {
+ throw new Error(error.message);
+ }
+ },
+};
diff --git a/examples/serverless-graviton-with-pack/package.json b/examples/serverless-graviton-with-pack/package.json
new file mode 100644
index 0000000..9e9b20c
--- /dev/null
+++ examples/serverless-graviton-with-pack/package.json
@@ -0,0 +1,21 @@
+{
+ "name": "serverless-graviton-with-pack",
+ "version": "0.0.0",
+ "description": "This package demonstrates using @sparticuz/chromium-min with a pack for Graviton (arm64)",
+ "license": "ISC",
+ "author": {
+ "name": "Kyle McNally"
+ },
+ "main": "index.js",
+ "scripts": {
+ "deploy": "sls deploy",
+ "test": "sls invoke --function chromium-arm-test --log"
+ },
+ "dependencies": {
+ "@sparticuz/chromium-min": "133.0.0",
+ "puppeteer-core": "24.7.2"
+ },
+ "devDependencies": {
+ "serverless": "^3.40.0"
+ }
+}
diff --git a/examples/serverless-graviton-with-pack/serverless.yml b/examples/serverless-graviton-with-pack/serverless.yml
new file mode 100644
index 0000000..debf8f6
--- /dev/null
+++ examples/serverless-graviton-with-pack/serverless.yml
@@ -0,0 +1,13 @@
+service: sls-graviton-with-pack
+
+provider:
+ name: aws
+ runtime: nodejs22.x
+ architecture: arm64
+ stage: dev
+ region: us-east-1
+ timeout: 300
+
+functions:
+ chromium-arm-test:
+ handler: index.handler
diff --git a/fonts/fonts.conf b/fonts/fonts.conf
new file mode 100644
index 0000000..3f8207e
--- /dev/null
+++ fonts/fonts.conf
@@ -0,0 +1,10 @@
+<?xml version="1.0" ?>
+<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
+<fontconfig>
+ <dir>/var/task/.fonts</dir>
+ <dir>/var/task/fonts</dir>
+ <dir>/opt/fonts</dir>
+ <dir>/tmp/fonts</dir>
+ <cachedir>/tmp/fonts-cache/</cachedir>
+ <config></config>
+</fontconfig>
diff --git package.json package.json
index dd3a292..dd049fa 100644
--- package.json
+++ package.json
@@ -1,6 +1,6 @@
{
"name": "@sparticuz/chromium",
- "version": "117.0.0",
+ "version": "143.0.0",
"description": "Chromium Binary for Serverless Platforms",
"keywords": [
"aws",
@@ -24,31 +24,61 @@
"author": {
"name": "Kyle McNally"
},
- "type": "commonjs",
- "main": "build/index.js",
- "types": "build/index.d.ts",
+ "type": "module",
+ "main": "./build/cjs/index.cjs",
+ "module": "./build/esm/index.js",
+ "exports": {
+ ".": {
+ "types": "./build/esm/index.d.ts",
+ "import": {
+ "types": "./build/esm/index.d.ts",
+ "default": "./build/esm/index.js"
+ },
+ "require": {
+ "types": "./build/cjs/index.d.ts",
+ "default": "./build/cjs/index.cjs"
+ }
+ }
+ },
+ "types": "./build/esm/index.d.ts",
"files": [
"bin",
+ "!bin/arm64",
+ "!bin/x64",
"build"
],
"scripts": {
- "build": "rm -rf build && tsc -p tsconfig.json",
- "test": "make clean && make && make pretest && make test"
+ "build": "chmod +x ./tools/build.sh && ./tools/build.sh",
+ "build:fonts": "rm bin/fonts.tar.br && node ./tools/download-open-sans.mjs",
+ "layer:x64": "make chromium-layer.x64.zip",
+ "layer:arm64": "make chromium-layer.arm64.zip",
+ "lint": "eslint",
+ "pack:x64": "make pack-x64",
+ "pack:arm64": "make pack-arm64",
+ "test:integration": "make clean && make && make pretest && make test",
+ "test:source": "make presource && vitest run --coverage && make postsource",
+ "update": "node ./tools/update-browser-revision.mjs"
},
"dependencies": {
- "follow-redirects": "^1.15.2",
- "tar-fs": "^3.0.4"
+ "follow-redirects": "^1.15.11",
+ "tar-fs": "^3.1.1"
},
"devDependencies": {
- "@tsconfig/node16": "^1.0.4",
- "@tsconfig/strictest": "^2.0.1",
- "@types/follow-redirects": "^1.14.1",
- "@types/node": "^18.16.17",
- "@types/tar-fs": "^2.0.1",
- "clean-modules": "^3.0.4",
- "typescript": "^5.1.6"
+ "@sparticuz/eslint-config": "^10.0.2",
+ "@tsconfig/node20": "^20.1.8",
+ "@tsconfig/strictest": "^2.0.8",
+ "@types/follow-redirects": "^1.14.4",
+ "@types/node": "^20.19.25",
+ "@types/tar-fs": "^2.0.4",
+ "@vitest/coverage-v8": "^4.0.15",
+ "clean-modules": "^3.1.1",
+ "eslint": "^9.39.1",
+ "puppeteer-core": "^24.32.0",
+ "typescript": "^5.9.3",
+ "typescript-eslint": "^8.48.1",
+ "vitest": "^4.0.15"
},
"engines": {
- "node": ">= 16"
+ "node": ">=20.11.0"
}
}
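The new `"exports"` map above is what lets v143 serve both module systems from one package: `import` resolves to `./build/esm/index.js`, `require` to `./build/cjs/index.cjs`. As a simplified model of how Node's conditional-exports resolution walks that map (this glosses over the full algorithm — `resolveExport` is illustrative, not Node's actual resolver):

```javascript
// Simplified sketch of Node's conditional "exports" resolution for the
// package.json map above: the first matching condition wins, and nested
// objects resolve through their "default" key.
function resolveExport(exportsMap, conditions) {
  const entry = exportsMap["."];
  for (const cond of conditions) {
    if (entry[cond]) return entry[cond].default ?? entry[cond];
  }
  return undefined;
}

const exportsMap = {
  ".": {
    import: { types: "./build/esm/index.d.ts", default: "./build/esm/index.js" },
    require: { types: "./build/cjs/index.d.ts", default: "./build/cjs/index.cjs" },
  },
};
```

So a CommonJS Lambda handler keeps working via `require("@sparticuz/chromium")` even though the package itself is now `"type": "module"`.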
diff --git source/helper.ts source/helper.ts
index 97cc36d..6a14092 100644
--- source/helper.ts
+++ source/helper.ts
@@ -1,59 +1,214 @@
-import { unlink } from "node:fs";
-import { https } from "follow-redirects";
+import fr from "follow-redirects";
+import { access, createWriteStream, rm, symlink } from "node:fs";
import { tmpdir } from "node:os";
+import { join } from "node:path";
import { extract } from "tar-fs";
-import { parse } from "node:url";
-import type { UrlWithStringQuery } from "node:url";
-interface FollowRedirOptions extends UrlWithStringQuery {
+interface FollowRedirOptions extends URL {
maxBodyLength: number;
}
+/**
+ * Creates a symlink to a file
+ */
+export const createSymlink = (
+ source: string,
+ target: string
+): Promise<void> => {
+ return new Promise((resolve, reject) => {
+ access(source, (error) => {
+ if (error) {
+ reject(error);
+ return;
+ }
+ symlink(source, target, (error) => {
+ /* c8 ignore next */
+ if (error) {
+ /* c8 ignore next 3 */
+ reject(error);
+ return;
+ }
+ resolve();
+ });
+ });
+ });
+};
+
+/**
+ * Downloads a file from a URL
+ */
+export const downloadFile = (
+ url: string,
+ outputPath: string
+): Promise<void> => {
+ return new Promise((resolve, reject) => {
+ const stream = createWriteStream(outputPath);
+ stream.once("error", reject);
+
+ fr.https
+ .get(url, (response) => {
+ if (response.statusCode !== 200) {
+ stream.close();
+ reject(
+ new Error(
+ /* c8 ignore next 2 */
+ `Unexpected status code: ${
+ response.statusCode?.toFixed(0) ?? "UNK"
+ }.`
+ )
+ );
+ return;
+ }
+
+ // Pipe directly to file rather than manually writing chunks
+ // This is more efficient and uses less memory
+ response.pipe(stream);
+
+ // Listen for completion
+ stream.once("finish", () => {
+ stream.close();
+ resolve();
+ });
+
+ // Handle response errors
+ response.once("error", (error) => {
+ /* c8 ignore next 2 */
+ stream.close();
+ reject(error);
+ });
+ })
+ /* c8 ignore next 3 */
+ .on("error", (error) => {
+ stream.close();
+ reject(error);
+ });
+ });
+};
+
+/**
+ * Adds the proper folders to the environment
+ * @param baseLibPath the path to this package's lib folder
+ */
+export const setupLambdaEnvironment = (baseLibPath: string) => {
+ // If the FONTCONFIG_PATH is not set, set it to /tmp/fonts
+ process.env["FONTCONFIG_PATH"] ??= join(tmpdir(), "fonts");
+ // Set up Home folder if not already set
+ process.env["HOME"] ??= tmpdir();
+
+ // If LD_LIBRARY_PATH is undefined, set it to baseLibPath, otherwise, add it
+ if (process.env["LD_LIBRARY_PATH"] === undefined) {
+ process.env["LD_LIBRARY_PATH"] = baseLibPath;
+ } else if (!process.env["LD_LIBRARY_PATH"].startsWith(baseLibPath)) {
+ process.env["LD_LIBRARY_PATH"] = [
+ baseLibPath,
+ ...new Set(process.env["LD_LIBRARY_PATH"].split(":")),
+ ].join(":");
+ }
+};
+
+/**
+ * Determines if the input is a valid URL
+ * @param input the input to check
+ * @returns boolean indicating if the input is a valid URL
+ */
export const isValidUrl = (input: string) => {
try {
- return !!new URL(input);
- } catch (err) {
+ return Boolean(new URL(input));
+ } catch {
return false;
}
};
/**
- * Determines if the running instance is inside an AWS Lambda container.
+ * Determines if the running instance is inside an Amazon Linux 2023 container.
* AWS_EXECUTION_ENV is for native Lambda instances
* AWS_LAMBDA_JS_RUNTIME is for netlify instances
- * @returns boolean indicating if the running instance is inside a Lambda container
+ * CODEBUILD_BUILD_IMAGE is for CodeBuild instances
+ * VERCEL is for Vercel Functions (Node 20 or later enables an AL2023-compatible environment).
+ * @returns boolean indicating if the running instance is inside a Lambda container with nodejs20
*/
-export const isRunningInAwsLambda = () => {
+export const isRunningInAmazonLinux2023 = (nodeMajorVersion: number) => {
+ const awsExecEnv = process.env["AWS_EXECUTION_ENV"] ?? "";
+ const awsLambdaJsRuntime = process.env["AWS_LAMBDA_JS_RUNTIME"] ?? "";
+ const codebuildImage = process.env["CODEBUILD_BUILD_IMAGE"] ?? "";
+
+ // Check for explicit version substrings, returns on first match
if (
- process.env["AWS_EXECUTION_ENV"] &&
- /^AWS_Lambda_nodejs/.test(process.env["AWS_EXECUTION_ENV"]) === true
+ awsExecEnv.includes("20.x") ||
+ awsExecEnv.includes("22.x") ||
+ awsExecEnv.includes("24.x") ||
+ awsLambdaJsRuntime.includes("20.x") ||
+ awsLambdaJsRuntime.includes("22.x") ||
+ awsLambdaJsRuntime.includes("24.x") ||
+ codebuildImage.includes("nodejs20") ||
+ codebuildImage.includes("nodejs22") ||
+ codebuildImage.includes("nodejs24")
) {
return true;
- } else if (
- process.env["AWS_LAMBDA_JS_RUNTIME"] &&
- /^nodejs/.test(process.env["AWS_LAMBDA_JS_RUNTIME"]) === true
- ) {
+ }
+
+ // Vercel: Node 20+ is AL2023 compatible
+ if (process.env["VERCEL"] && nodeMajorVersion >= 20) {
return true;
}
+
return false;
};
-export const downloadAndExtract = async (url: string) =>
- new Promise<string>((resolve, reject) => {
- const getOptions = parse(url) as FollowRedirOptions;
- getOptions.maxBodyLength = 60 * 1024 * 1024; // 60mb
- const destDir = `${tmpdir()}/chromium-pack`;
+export const downloadAndExtract = async (url: string) => {
+ const getOptions = new URL(url) as FollowRedirOptions;
+ // Increase the max body length to 60MB for larger files
+ getOptions.maxBodyLength = 60 * 1024 * 1024;
+ const destDir = join(tmpdir(), "chromium-pack");
+
+ return new Promise<string>((resolve, reject) => {
const extractObj = extract(destDir);
- https
- .get(url, (response) => {
- response.pipe(extractObj);
- extractObj.on("finish", () => {
- resolve(destDir);
- });
- })
- .on("error", (err) => {
- unlink(destDir, (_) => {
- reject(err);
- });
+
+ // Setup error handlers for better cleanup
+ /* c8 ignore next 5 */
+ const cleanupOnError = (err: Error) => {
+ rm(destDir, { force: true, recursive: true }, () => {
+ reject(err);
});
+ };
+
+ // Attach error handler to extract stream
+ extractObj.once("error", cleanupOnError);
+
+ // Handle extraction completion
+ extractObj.once("finish", () => {
+ resolve(destDir);
+ });
+
+ const req = fr.https.get(url, (response) => {
+ /* c8 ignore next */
+ if (response.statusCode !== 200) {
+ /* c8 ignore next 9 */
+ reject(
+ new Error(
+ `Unexpected status code: ${
+ response.statusCode?.toFixed(0) ?? "UNK"
+ }.`
+ )
+ );
+ return;
+ }
+
+ // Pipe the response directly to the extraction stream
+ response.pipe(extractObj);
+
+ // Handle response errors
+ response.once("error", cleanupOnError);
+ });
+
+ // Handle request errors
+ req.once("error", cleanupOnError);
+
+ // Set a timeout to avoid hanging requests
+ req.setTimeout(60 * 1000, () => {
+ /* c8 ignore next 2 */
+ req.destroy();
+ cleanupOnError(new Error("Request timeout"));
+ });
});
+};
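The `LD_LIBRARY_PATH` handling in `setupLambdaEnvironment` above prepends the extracted lib directory exactly once, de-duplicating whatever was already in the variable. A standalone re-implementation sketch of just that string logic (pure function, no `process.env` side effects):

```javascript
// Sketch of setupLambdaEnvironment's LD_LIBRARY_PATH logic: if unset, use the
// base lib path; if it already leads with it, leave it alone; otherwise
// prepend it, de-duplicating the existing colon-separated entries via a Set.
function prependLibPath(baseLibPath, current) {
  if (current === undefined) return baseLibPath;
  if (current.startsWith(baseLibPath)) return current;
  return [baseLibPath, ...new Set(current.split(":"))].join(":");
}
```

Note the `Set` only de-duplicates the pre-existing entries, matching the source: an occurrence of the base path deeper in the variable is not collapsed.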
diff --git source/index.ts source/index.ts
index 79a5e63..6833741 100644
--- source/index.ts
+++ source/index.ts
@@ -1,248 +1,87 @@
-import {
- access,
- createWriteStream,
- existsSync,
- mkdirSync,
- symlink,
-} from "node:fs";
-import { https } from "follow-redirects";
-import LambdaFS from "./lambdafs";
+import { existsSync, mkdirSync } from "node:fs";
+import { tmpdir } from "node:os";
import { join } from "node:path";
import { URL } from "node:url";
-import { downloadAndExtract, isRunningInAwsLambda, isValidUrl } from "./helper";
-
-/** Viewport taken from https://github.com/puppeteer/puppeteer/blob/main/docs/api/puppeteer.viewport.md */
-interface Viewport {
- /**
- * The page width in pixels.
- */
- width: number;
- /**
- * The page height in pixels.
- */
- height: number;
- /**
- * Specify device scale factor.
- * See {@link https://developer.mozilla.org/en-US/docs/Web/API/Window/devicePixelRatio | devicePixelRatio} for more info.
- * @default 1
- */
- deviceScaleFactor?: number;
- /**
- * Whether the `meta viewport` tag is taken into account.
- * @default false
- */
- isMobile?: boolean;
- /**
- * Specifies if the viewport is in landscape mode.
- * @default false
- */
- isLandscape?: boolean;
- /**
- * Specify if the viewport supports touch events.
- * @default false
- */
- hasTouch?: boolean;
-}
-if (isRunningInAwsLambda()) {
- if (process.env["FONTCONFIG_PATH"] === undefined) {
- process.env["FONTCONFIG_PATH"] = "/tmp/aws";
- }
-
- if (process.env["LD_LIBRARY_PATH"] === undefined) {
- process.env["LD_LIBRARY_PATH"] = "/tmp/aws/lib";
- } else if (
- process.env["LD_LIBRARY_PATH"].startsWith("/tmp/aws/lib") !== true
- ) {
- process.env["LD_LIBRARY_PATH"] = [
- ...new Set([
- "/tmp/aws/lib",
- ...process.env["LD_LIBRARY_PATH"].split(":"),
- ]),
- ].join(":");
- }
+import {
+ createSymlink,
+ downloadAndExtract,
+ downloadFile,
+ isRunningInAmazonLinux2023,
+ isValidUrl,
+ setupLambdaEnvironment,
+} from "./helper.js";
+import { inflate } from "./lambdafs.js";
+import { getBinPath } from "./paths.esm.js";
+
+const nodeMajorVersion = Number.parseInt(
+ process.versions.node.split(".")[0] ?? ""
+);
+
+// Setup the lambda environment
+if (isRunningInAmazonLinux2023(nodeMajorVersion)) {
+ setupLambdaEnvironment(join(tmpdir(), "al2023", "lib"));
}
+// eslint-disable-next-line @typescript-eslint/no-extraneous-class
class Chromium {
- /**
- * Determines the headless mode that chromium will run at
- * https://developer.chrome.com/articles/new-headless/#try-out-the-new-headless
- * @values true or "new"
- */
- private static headlessMode: true | "new" = "new";
-
- /**
- * If true, the graphics stack and webgl is enabled,
- * If false, webgl will be disabled.
- * (If false, the swiftshader.tar.br file will also not extract)
- */
- private static graphicsMode: boolean = true;
-
- /**
- * Downloads or symlinks a custom font and returns its basename, patching the environment so that Chromium can find it.
- */
- static font(input: string): Promise<string> {
- if (process.env["HOME"] === undefined) {
- process.env["HOME"] = "/tmp";
- }
-
- if (existsSync(`${process.env["HOME"]}/.fonts`) !== true) {
- mkdirSync(`${process.env["HOME"]}/.fonts`);
- }
-
- return new Promise((resolve, reject) => {
- if (/^https?:[/][/]/i.test(input) !== true) {
- input = `file://${input}`;
- }
-
- const url = new URL(input);
- const output = `${process.env["HOME"]}/.fonts/${url.pathname
- .split("/")
- .pop()}`;
-
- if (existsSync(output) === true) {
- return resolve(output.split("/").pop() as string);
- }
-
- if (url.protocol === "file:") {
- access(url.pathname, (error) => {
- if (error != null) {
- return reject(error);
- }
-
- symlink(url.pathname, output, (error) => {
- return error != null
- ? reject(error)
- : resolve(url.pathname.split("/").pop() as string);
- });
- });
- } else {
- https.get(input, (response) => {
- if (response.statusCode !== 200) {
- return reject(`Unexpected status code: ${response.statusCode}.`);
- }
-
- const stream = createWriteStream(output);
-
- stream.once("error", (error) => {
- return reject(error);
- });
-
- response.on("data", (chunk) => {
- stream.write(chunk);
- });
-
- response.once("end", () => {
- stream.end(() => {
- return resolve(url.pathname.split("/").pop() as string);
- });
- });
- });
- }
- });
- }
-
/**
* Returns a list of additional Chromium flags recommended for serverless environments.
* The canonical list of flags can be found on https://peter.sh/experiments/chromium-command-line-switches/.
+ * Most of below can be found here: https://github.com/GoogleChrome/chrome-launcher/blob/main/docs/chrome-flags-for-tools.md
*/
static get args(): string[] {
- /**
- * These are the default args in puppeteer.
- * https://github.com/puppeteer/puppeteer/blob/3a31070d054fa3cd8116ca31c578807ed8d6f987/packages/puppeteer-core/src/node/ChromeLauncher.ts#L185
- */
- const puppeteerFlags = [
- "--allow-pre-commit-input",
- "--disable-background-networking",
- "--disable-background-timer-throttling",
- "--disable-backgrounding-occluded-windows",
- "--disable-breakpad",
- "--disable-client-side-phishing-detection",
- "--disable-component-extensions-with-background-pages",
- "--disable-component-update",
- "--disable-default-apps",
- "--disable-dev-shm-usage",
- "--disable-extensions",
- "--disable-hang-monitor",
- "--disable-ipc-flooding-protection",
- "--disable-popup-blocking",
- "--disable-prompt-on-repost",
- "--disable-renderer-backgrounding",
- "--disable-sync",
- "--enable-automation",
- // TODO(sadym): remove '--enable-blink-features=IdleDetection' once
- // IdleDetection is turned on by default.
- "--enable-blink-features=IdleDetection",
- "--export-tagged-pdf",
- "--force-color-profile=srgb",
- "--metrics-recording-only",
- "--no-first-run",
- "--password-store=basic",
- "--use-mock-keychain",
- ];
- const puppeteerDisableFeatures = [
- "Translate",
- "BackForwardCache",
- // AcceptCHFrame disabled because of crbug.com/1348106.
- "AcceptCHFrame",
- "MediaRouter",
- "OptimizationHints",
- ];
- const puppeteerEnableFeatures = ["NetworkServiceInProcess2"];
-
const chromiumFlags = [
- "--disable-domain-reliability", // https://github.com/GoogleChrome/chrome-launcher/blob/main/docs/chrome-flags-for-tools.md#background-networking
+ "--ash-no-nudges", // Avoids blue bubble "user education" nudges (eg., "… give your browser a new look", Memory Saver)
+ "--disable-domain-reliability", // Disables Domain Reliability Monitoring, which tracks whether the browser has difficulty contacting Google-owned sites and uploads reports to Google.
"--disable-print-preview", // https://source.chromium.org/search?q=lang:cpp+symbol:kDisablePrintPreview&ss=chromium
- "--disable-speech-api", // https://source.chromium.org/search?q=lang:cpp+symbol:kDisableSpeechAPI&ss=chromium
- "--disk-cache-size=33554432", // https://source.chromium.org/search?q=lang:cpp+symbol:kDiskCacheSize&ss=chromium
- "--mute-audio", // https://source.chromium.org/search?q=lang:cpp+symbol:kMuteAudio&ss=chromium
- "--no-default-browser-check", // https://source.chromium.org/search?q=lang:cpp+symbol:kNoDefaultBrowserCheck&ss=chromium
- "--no-pings", // https://source.chromium.org/search?q=lang:cpp+symbol:kNoPings&ss=chromium
- "--single-process", // Needs to be single-process to avoid `prctl(PR_SET_NO_NEW_PRIVS) failed` error
+ "--disk-cache-size=33554432", // https://source.chromium.org/search?q=lang:cpp+symbol:kDiskCacheSize&ss=chromium Forces the maximum disk space to be used by the disk cache, in bytes.
+ "--no-default-browser-check", // Disable the default browser check, do not prompt to set it as such. (This is already set by Playwright, but not Puppeteer)
+ "--no-pings", // Don't send hyperlink auditing pings
+ "--single-process", // Runs the renderer and plugins in the same process as the browser. NOTES: Needs to be single-process to avoid `prctl(PR_SET_NO_NEW_PRIVS) failed` error
+ "--font-render-hinting=none", // https://github.com/puppeteer/puppeteer/issues/2410#issuecomment-560573612
];
const chromiumDisableFeatures = [
"AudioServiceOutOfProcess",
"IsolateOrigins",
- "site-per-process",
+ "site-per-process", // Disables OOPIF. https://www.chromium.org/Home/chromium-security/site-isolation
];
const chromiumEnableFeatures = ["SharedArrayBuffer"];
const graphicsFlags = [
- "--hide-scrollbars", // https://source.chromium.org/search?q=lang:cpp+symbol:kHideScrollbars&ss=chromium
"--ignore-gpu-blocklist", // https://source.chromium.org/search?q=lang:cpp+symbol:kIgnoreGpuBlocklist&ss=chromium
- "--in-process-gpu", // https://source.chromium.org/search?q=lang:cpp+symbol:kInProcessGPU&ss=chromium
- "--window-size=1920,1080", // https://source.chromium.org/search?q=lang:cpp+symbol:kWindowSize&ss=chromium
+ "--in-process-gpu", // Saves some memory by moving GPU process into a browser process thread
];
// https://chromium.googlesource.com/chromium/src/+/main/docs/gpu/swiftshader.md
- this.graphics
- ? graphicsFlags.push("--use-gl=angle", "--use-angle=swiftshader")
- : graphicsFlags.push("--disable-webgl");
+ if (this.graphics) {
+ graphicsFlags.push(
+ // As the unsafe WebGL fallback, SwANGLE (ANGLE + SwiftShader Vulkan)
+ "--use-gl=angle",
+ "--use-angle=swiftshader",
+ "--enable-unsafe-swiftshader"
+ );
+ } else {
+ graphicsFlags.push("--disable-webgl");
+ }
const insecureFlags = [
"--allow-running-insecure-content", // https://source.chromium.org/search?q=lang:cpp+symbol:kAllowRunningInsecureContent&ss=chromium
- "--disable-setuid-sandbox", // https://source.chromium.org/search?q=lang:cpp+symbol:kDisableSetuidSandbox&ss=chromium
+ "--disable-setuid-sandbox", // Lambda runs as root, so this is required to allow Chromium to run as root
"--disable-site-isolation-trials", // https://source.chromium.org/search?q=lang:cpp+symbol:kDisableSiteIsolation&ss=chromium
"--disable-web-security", // https://source.chromium.org/search?q=lang:cpp+symbol:kDisableWebSecurity&ss=chromium
- "--no-sandbox", // https://source.chromium.org/search?q=lang:cpp+symbol:kNoSandbox&ss=chromium
- "--no-zygote", // https://source.chromium.org/search?q=lang:cpp+symbol:kNoZygote&ss=chromium
];
const headlessFlags = [
- this.headless === "new" ? "--headless='new'" : "--headless",
+ "--headless='shell'", // We only support running chrome-headless-shell
+ "--no-sandbox", // https://source.chromium.org/search?q=lang:cpp+symbol:kNoSandbox&ss=chromium
+ "--no-zygote", // https://source.chromium.org/search?q=lang:cpp+symbol:kNoZygote&ss=chromium
];
return [
- ...puppeteerFlags,
...chromiumFlags,
- `--disable-features=${[
- ...puppeteerDisableFeatures,
- ...chromiumDisableFeatures,
- ].join(",")}`,
- `--enable-features=${[
- ...puppeteerEnableFeatures,
- ...chromiumEnableFeatures,
- ].join(",")}`,
+ `--disable-features=${[...chromiumDisableFeatures].join(",")}`,
+ `--enable-features=${[...chromiumEnableFeatures].join(",")}`,
...graphicsFlags,
...insecureFlags,
...headlessFlags,
@@ -250,19 +89,35 @@ class Chromium {
}
/**
- * Returns sensible default viewport settings for serverless environments.
+ * Returns whether the graphics stack is enabled or disabled
+ * @returns boolean
+ */
+ public static get graphics() {
+ return this.graphicsMode;
+ }
+
+ /**
+ * Sets whether the graphics stack is enabled or disabled.
+ * @param true means the stack is enabled. WebGL will work.
+ * @param false means that the stack is disabled. WebGL will not work.
+ * @default true
*/
- static get defaultViewport(): Required<Viewport> {
- return {
- deviceScaleFactor: 1,
- hasTouch: false,
- height: 1080,
- isLandscape: true,
- isMobile: false,
- width: 1920,
- };
+ public static set setGraphicsMode(value: boolean) {
+ if (typeof value !== "boolean") {
+ throw new TypeError(
+ `Graphics mode must be a boolean, you entered '${String(value)}'`
+ );
+ }
+ this.graphicsMode = value;
}
+ /**
+ * If true, the graphics stack and webgl is enabled,
+ * If false, webgl will be disabled.
+ * (If false, the swiftshader.tar.br file will also not extract)
+ */
+ private static graphicsMode = true;
+
/**
* Inflates the included version of Chromium
* @param input The location of the `bin` folder
@@ -272,8 +127,8 @@ class Chromium {
/**
* If the `chromium` binary already exists in /tmp/chromium, return it.
*/
- if (existsSync("/tmp/chromium") === true) {
- return Promise.resolve("/tmp/chromium");
+ if (existsSync(join(tmpdir(), "chromium"))) {
+ return join(tmpdir(), "chromium");
}
/**
@@ -287,88 +142,89 @@ class Chromium {
/**
* If input is defined, use that as the location of the brotli files,
- * otherwise, the default location is ../bin.
+ * otherwise, the default location is ../../bin.
* A custom location is needed for workflows that using custom packaging.
*/
- input ??= join(__dirname, "..", "bin");
+ input ??= getBinPath();
/**
* If the input directory doesn't exist, throw an error.
*/
if (!existsSync(input)) {
- throw new Error(`The input directory "${input}" does not exist.`);
+ throw new Error(
+ `The input directory "${input}" does not exist. Please provide the location of the brotli files.`
+ );
}
// Extract the required files
- const promises = [LambdaFS.inflate(`${input}/chromium.br`)];
- if (this.graphics) {
- // Only inflate graphics stack if needed
- promises.push(LambdaFS.inflate(`${input}/swiftshader.tar.br`));
- }
- if (isRunningInAwsLambda()) {
- // If running in AWS Lambda, extract more required files
- promises.push(LambdaFS.inflate(`${input}/aws.tar.br`));
+ const promises = [
+ inflate(join(input, "chromium.br")),
+ inflate(join(input, "fonts.tar.br")),
+ inflate(join(input, "swiftshader.tar.br")),
+ ];
+ if (isRunningInAmazonLinux2023(nodeMajorVersion)) {
+ promises.push(inflate(join(input, "al2023.tar.br")));
}
// Await all extractions
const result = await Promise.all(promises);
// Returns the first result of the promise, which is the location of the `chromium` binary
- return result.shift() as string;
+ // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
+ return result.shift()!;
}
/**
- * Returns the headless mode.
- * `true` means the 'old' (legacy, chromium < 112) headless mode.
- * "new" means the 'new' headless mode.
- * https://developer.chrome.com/articles/new-headless/#try-out-the-new-headless
- * @returns true | "new"
+ * Downloads or symlinks a custom font and returns its basename, patching the environment so that Chromium can find it.
*/
- public static get headless() {
- return this.headlessMode;
- }
+ static async font(input: string): Promise<string> {
+ const fontsDir =
+ process.env["FONTCONFIG_PATH"] ??
+ join(process.env["HOME"] ?? tmpdir(), ".fonts");
- /**
- * Sets the headless mode.
- * `true` means the 'old' (legacy, chromium < 112) headless mode.
- * "new" means the 'new' headless mode.
- * https://developer.chrome.com/articles/new-headless/#try-out-the-new-headless
- * @default "new"
- */
- public static set setHeadlessMode(value: true | "new") {
- if (
- (typeof value === "string" && value !== "new") ||
- (typeof value === "boolean" && value !== true)
- ) {
- throw new Error(
- `Headless mode must be either \`true\` or 'new', you entered '${value}'`
- );
+ // Create fonts directory if it doesn't exist
+ if (!existsSync(fontsDir)) {
+ mkdirSync(fontsDir);
}
- this.headlessMode = value;
- }
- /**
- * Returns whether the graphics stack is enabled or disabled
- * @returns boolean
- */
- public static get graphics() {
- return this.graphicsMode;
- }
+ // Convert local path to file URL if needed
+ if (!/^https?:\/\//i.test(input)) {
+ input = `file://${input}`;
+ }
- /**
- * Sets whether the graphics stack is enabled or disabled.
- * @param true means the stack is enabled. WebGL will work.
- * @param false means that the stack is disabled. WebGL will not work.
- * `false` will also skip the extract of the graphics driver, saving about a second during initial extract
- * @default true
- */
- public static set setGraphicsMode(value: boolean) {
- if (typeof value !== "boolean") {
- throw new Error(
- `Graphics mode must be a boolean, you entered '${value}'`
- );
+ const url = new URL(input);
+ const fontName = url.pathname.split("/").pop();
+
+ if (!fontName) {
+ throw new Error(`Invalid font name: ${url.pathname}`);
+ }
+ const outputPath = `${fontsDir}/${fontName}`;
+
+ // Return font name if it already exists
+ if (existsSync(outputPath)) {
+ return fontName;
+ }
+
+ // Handle local file
+ if (url.protocol === "file:") {
+ try {
+ await createSymlink(url.pathname, outputPath);
+ return fontName;
+ } catch (error) {
+ throw new Error(
+ `Failed to create symlink for font: ${JSON.stringify(error)}`
+ );
+ }
+ }
+ // Handle remote file
+ else {
+ try {
+ await downloadFile(input, outputPath);
+ return fontName;
+ } catch (error) {
+ throw new Error(`Failed to download font: ${JSON.stringify(error)}`);
+ }
}
- this.graphicsMode = value;
}
}
-export = Chromium;
+export default Chromium;
diff --git source/lambdafs.ts source/lambdafs.ts
index 57a63f8..0601f7f 100644
--- source/lambdafs.ts
+++ source/lambdafs.ts
@@ -1,75 +1,88 @@
import { createReadStream, createWriteStream, existsSync } from "node:fs";
import { tmpdir } from "node:os";
import { basename, join } from "node:path";
-import { extract } from "tar-fs";
import { createBrotliDecompress, createUnzip } from "node:zlib";
+import { extract } from "tar-fs";
-class LambdaFS {
- /**
- * Decompresses a (tarballed) Brotli or Gzip compressed file and returns the path to the decompressed file/folder.
- *
- * @param filePath Path of the file to decompress.
- */
- static inflate(filePath: string): Promise<string> {
- const output = filePath.includes("swiftshader")
- ? tmpdir()
- : join(
- tmpdir(),
- basename(filePath).replace(
- /[.](?:t(?:ar(?:[.](?:br|gz))?|br|gz)|br|gz)$/i,
- ""
- )
- );
+/**
+ * Decompresses a (tarballed) Brotli or Gzip compressed file and returns the path to the decompressed file/folder.
+ *
+ * @param filePath Path of the file to decompress.
+ */
+export const inflate = (filePath: string): Promise<string> => {
+ // Determine the output path based on the file type
+ const output = filePath.includes("swiftshader")
+ ? tmpdir()
+ : join(
+ tmpdir(),
+ basename(filePath).replace(
+ /\.(?:t(?:ar(?:\.(?:br|gz))?|br|gz)|br|gz)$/i,
+ ""
+ )
+ );
- return new Promise((resolve, reject) => {
- if (filePath.includes("swiftshader")) {
- if (existsSync(`${output}/libGLESv2.so`)) {
- return resolve(output);
- }
- } else {
- if (existsSync(output) === true) {
- return resolve(output);
- }
+ return new Promise((resolve, reject) => {
+ // Quick return if the file is already decompressed
+ if (filePath.includes("swiftshader")) {
+ if (existsSync(`${output}/libGLESv2.so`)) {
+ resolve(output);
+ return;
}
+ } else if (existsSync(output)) {
+ resolve(output);
+ return;
+ }
- let source = createReadStream(filePath, { highWaterMark: 2 ** 23 });
- let target = null;
+ // Optimize chunk size based on file type - use smaller chunks for better memory usage
+ // Brotli files tend to decompress to much larger sizes
+ const isBrotli = /br$/i.test(filePath);
+ const isGzip = /gz$/i.test(filePath);
+ const isTar = /\.t(?:ar(?:\.(?:br|gz))?|br|gz)$/i.test(filePath);
- if (/[.](?:t(?:ar(?:[.](?:br|gz))?|br|gz))$/i.test(filePath) === true) {
- target = extract(output);
+ // Use a smaller highWaterMark for better memory efficiency
+ // For most serverless environments, 4MB (2**22) is more memory-efficient than 8MB
+ const highWaterMark = 2 ** 22;
- target.once("finish", () => {
- return resolve(output);
- });
- } else {
- target = createWriteStream(output, { mode: 0o700 });
- }
+ const source = createReadStream(filePath, { highWaterMark });
+ let target;
- source.once("error", (error: Error) => {
- return reject(error);
- });
+ // Setup error handlers first for both streams
+ const handleError = (error: Error) => {
+ reject(error);
+ };
- target.once("error", (error: Error) => {
- return reject(error);
- });
+ source.once("error", handleError);
+ // Setup the appropriate target stream based on file type
+ if (isTar) {
+ target = extract(output);
+ target.once("finish", () => {
+ resolve(output);
+ });
+ } else {
+ target = createWriteStream(output, { mode: 0o700 });
target.once("close", () => {
- return resolve(output);
+ resolve(output);
});
+ }
- if (/(?:br|gz)$/i.test(filePath) === true) {
- source
- .pipe(
- /br$/i.test(filePath)
- ? createBrotliDecompress({ chunkSize: 2 ** 21 })
- : createUnzip({ chunkSize: 2 ** 21 })
- )
- .pipe(target);
- } else {
- source.pipe(target);
- }
- });
- }
-}
+ target.once("error", handleError);
+
+ // Pipe through the appropriate decompressor if needed
+ if (isBrotli || isGzip) {
+ // Use optimized chunk size for decompression
+ // 2MB (2**21) is sufficient for most brotli/gzip files
+ const decompressor = isBrotli
+ ? createBrotliDecompress({ chunkSize: 2 ** 21 })
+ : createUnzip({ chunkSize: 2 ** 21 });
+
+ // Handle decompressor errors
+ decompressor.once("error", handleError);
-export = LambdaFS;
+ // Chain the streams
+ source.pipe(decompressor).pipe(target);
+ } else {
+ source.pipe(target);
+ }
+ });
+};
diff --git a/source/paths.cjs.ts b/source/paths.cjs.ts
new file mode 100644
index 0000000..30f7f69
--- /dev/null
+++ source/paths.cjs.ts
@@ -0,0 +1,9 @@
+import { dirname, join } from "node:path";
+
+/**
+ * Get the bin directory path for CommonJS modules
+ */
+export function getBinPath(): string {
+ // eslint-disable-next-line unicorn/prefer-module
+ return join(dirname(__filename), "..", "..", "bin");
+}
diff --git a/source/paths.esm.ts b/source/paths.esm.ts
new file mode 100644
index 0000000..16f591e
--- /dev/null
+++ source/paths.esm.ts
@@ -0,0 +1,9 @@
+import { dirname, join } from "node:path";
+import { fileURLToPath } from "node:url";
+
+/**
+ * Get the bin directory path for ESM modules
+ */
+export function getBinPath(): string {
+ return join(dirname(fileURLToPath(import.meta.url)), "..", "..", "bin");
+}
diff --git a/tests/chromium.test.ts b/tests/chromium.test.ts
new file mode 100644
index 0000000..1ce2415
--- /dev/null
+++ tests/chromium.test.ts
@@ -0,0 +1,449 @@
+import { execSync } from "node:child_process";
+import { createHash } from "node:crypto";
+import {
+ existsSync,
+ lstatSync,
+ readFileSync,
+ readlinkSync,
+ rmSync,
+ unlinkSync,
+ writeFileSync,
+} from "node:fs";
+import { tmpdir } from "node:os";
+import { join } from "node:path";
+import puppeteer from "puppeteer-core";
+import {
+ afterAll,
+ afterEach,
+ beforeAll,
+ beforeEach,
+ describe,
+ expect,
+ it,
+} from "vitest";
+
+import {
+ createSymlink,
+ downloadAndExtract,
+ downloadFile,
+ isRunningInAmazonLinux2023,
+ isValidUrl,
+ setupLambdaEnvironment,
+} from "../source/helper.js";
+import chromium from "../source/index.js";
+import { inflate } from "../source/lambdafs.js";
+import { getBinPath } from "../source/paths.esm.js";
+
+describe("Helper", () => {
+ // Save original environment and restore after each test
+ const originalEnv = process.env;
+
+ beforeEach(() => {
+ // Clone environment to avoid test pollution
+ process.env = { ...originalEnv };
+ });
+
+ afterEach(() => {
+ // Restore the original environment
+ process.env = originalEnv;
+ });
+
+ describe("createSymlink", () => {
+ it("should create a symlink when the source file exists", async () => {
+ // Setup: create a temp file
+ const tempDir = tmpdir();
+ const sourceFile = join(tempDir, `source_${Date.now().toFixed(0)}.txt`);
+ const targetLink = join(tempDir, `target_${Date.now().toFixed(0)}.txt`);
+ writeFileSync(sourceFile, "test content");
+
+ try {
+ // Execute
+ await createSymlink(sourceFile, targetLink);
+
+ // Verify: targetLink exists and is a symlink
+ const stat = lstatSync(targetLink);
+ expect(stat.isSymbolicLink()).toBe(true);
+
+ // Verify: the symlink points to the correct file
+ const linkTarget = readlinkSync(targetLink);
+ expect(linkTarget).toBe(sourceFile);
+
+ // Verify: reading the symlink gives the original content
+ const content = readFileSync(targetLink, "utf8");
+ expect(content).toBe("test content");
+ } finally {
+ // Cleanup
+ try {
+ unlinkSync(sourceFile);
+ unlinkSync(targetLink);
+ } catch (error) {
+ // Ignore errors during cleanup
+ console.error("Cleanup error:", error);
+ }
+ }
+ });
+
+ it("should reject if the source file does not exist", async () => {
+ const tempDir = tmpdir();
+ const sourceFile = join(
+ tempDir,
+ `nonexistent_${Date.now().toFixed(0)}.txt`
+ );
+ const targetLink = join(tempDir, `target_${Date.now().toFixed(0)}.txt`);
+
+ // Execute & Verify
+ await expect(
+ createSymlink(sourceFile, targetLink)
+ ).rejects.toBeInstanceOf(Error);
+
+ // Cleanup: ensure no symlink was created
+ expect(existsSync(targetLink)).toBe(false);
+ });
+ });
+
+ describe("downloadFile", () => {
+ it("should download a file successfully", async () => {
+ const url = "https://www.example.com/index.html";
+ const tempDir = tmpdir();
+ const destPath = join(
+ tempDir,
+ `download_test_${Date.now().toFixed(0)}.txt`
+ );
+
+ try {
+ await downloadFile(url, destPath);
+ expect(existsSync(destPath)).toBe(true);
+ const content = readFileSync(destPath, "utf8");
+ expect(content).toBeTruthy();
+ } finally {
+ try {
+ unlinkSync(destPath);
+ } catch (error) {
+ // Ignore errors during cleanup
+ console.error("Cleanup error:", error);
+ }
+ }
+ });
+
+ it("should reject when status code is not 200", async () => {
+ // Execute & Verify
+ await expect(
+ // eslint-disable-next-line sonarjs/publicly-writable-directories
+ downloadFile("https://example.com/file.zip", "/tmp/file.zip")
+ ).rejects.toStrictEqual(new Error("Unexpected status code: 404."));
+ });
+ });
+
+ describe("setupLambdaEnvironment", () => {
+ it("should set FONTCONFIG_PATH if not defined", () => {
+ delete process.env["FONTCONFIG_PATH"];
+ setupLambdaEnvironment("/lib/path");
+ // eslint-disable-next-line sonarjs/publicly-writable-directories
+ expect(process.env["FONTCONFIG_PATH"]).toBe("/tmp/fonts");
+ });
+
+ it("should not override FONTCONFIG_PATH if already defined", () => {
+ process.env["FONTCONFIG_PATH"] = "/custom/fonts";
+ setupLambdaEnvironment("/lib/path");
+ expect(process.env["FONTCONFIG_PATH"]).toBe("/custom/fonts");
+ });
+
+ it("should set HOME if not defined", () => {
+ delete process.env["HOME"];
+ setupLambdaEnvironment("/lib/path");
+ expect(process.env["HOME"]).toBe("/tmp");
+ });
+
+ it("should not override HOME if already defined", () => {
+ process.env["HOME"] = "/custom/home";
+ setupLambdaEnvironment("/lib/path");
+ expect(process.env["HOME"]).toBe("/custom/home");
+ });
+
+ it("should set LD_LIBRARY_PATH if not defined", () => {
+ delete process.env["LD_LIBRARY_PATH"];
+ setupLambdaEnvironment("/lib/path");
+ expect(process.env["LD_LIBRARY_PATH"]).toBe("/lib/path");
+ });
+
+ it("should prepend baseLibPath to LD_LIBRARY_PATH if not already included", () => {
+ process.env["LD_LIBRARY_PATH"] = "/usr/lib:/usr/local/lib";
+ setupLambdaEnvironment("/lib/path");
+ expect(process.env["LD_LIBRARY_PATH"]).toBe(
+ "/lib/path:/usr/lib:/usr/local/lib"
+ );
+ });
+
+ it("should not modify LD_LIBRARY_PATH if baseLibPath is already at the start", () => {
+ process.env["LD_LIBRARY_PATH"] = "/lib/path:/usr/lib";
+ setupLambdaEnvironment("/lib/path");
+ expect(process.env["LD_LIBRARY_PATH"]).toBe("/lib/path:/usr/lib");
+ });
+ });
+
+ describe("isValidUrl", () => {
+ it("should return true for valid URLs", () => {
+ expect(isValidUrl("https://example.com")).toBe(true);
+ expect(isValidUrl("http://localhost:3000")).toBe(true);
+ expect(isValidUrl("ftp://ftp.example.com")).toBe(true);
+ });
+
+ it("should return false for invalid URLs", () => {
+ expect(isValidUrl("not-a-url")).toBe(false);
+ expect(isValidUrl("http://")).toBe(false);
+ expect(isValidUrl("")).toBe(false);
+ });
+ });
+
+  describe("isRunningInAmazonLinux2023", () => {
+ it("should return true for AWS Lambda Node.js 20 environment", () => {
+ process.env["AWS_EXECUTION_ENV"] = "AWS_Lambda_nodejs20.x";
+ expect(isRunningInAmazonLinux2023(20)).toBe(true);
+ });
+
+ it("should return true for AWS Lambda Node.js 22 environment", () => {
+ process.env["AWS_EXECUTION_ENV"] = "AWS_Lambda_nodejs22.x";
+ expect(isRunningInAmazonLinux2023(22)).toBe(true);
+ });
+
+ it("should return true for AWS Lambda Node.js 24 environment", () => {
+ process.env["AWS_EXECUTION_ENV"] = "AWS_Lambda_nodejs24.x";
+ expect(isRunningInAmazonLinux2023(24)).toBe(true);
+ });
+
+ it("should return true for AWS Lambda JS Runtime Node.js 20 environment", () => {
+ delete process.env["AWS_EXECUTION_ENV"];
+ process.env["AWS_LAMBDA_JS_RUNTIME"] = "nodejs20.x";
+ expect(isRunningInAmazonLinux2023(20)).toBe(true);
+ });
+
+ it("should return true for CodeBuild with Node.js 20", () => {
+ delete process.env["AWS_EXECUTION_ENV"];
+ delete process.env["AWS_LAMBDA_JS_RUNTIME"];
+ process.env["CODEBUILD_BUILD_IMAGE"] =
+ "aws/codebuild/amazonlinux2-x86_64-standard:4.0-nodejs20";
+ expect(isRunningInAmazonLinux2023(20)).toBe(true);
+ });
+
+ it("should return true for Vercel with Node.js 20", () => {
+ delete process.env["AWS_EXECUTION_ENV"];
+ delete process.env["AWS_LAMBDA_JS_RUNTIME"];
+ delete process.env["CODEBUILD_BUILD_IMAGE"];
+ process.env["VERCEL"] = "1";
+ expect(isRunningInAmazonLinux2023(20)).toBe(true);
+ });
+
+ it("should return false for Node.js 18 AWS Lambda environment", () => {
+ process.env["AWS_EXECUTION_ENV"] = "AWS_Lambda_nodejs18.x";
+ expect(isRunningInAmazonLinux2023(18)).toBe(false);
+ });
+
+ it("should return false for non-Lambda environments", () => {
+ delete process.env["AWS_EXECUTION_ENV"];
+ delete process.env["AWS_LAMBDA_JS_RUNTIME"];
+ delete process.env["CODEBUILD_BUILD_IMAGE"];
+ delete process.env["VERCEL"];
+ expect(isRunningInAmazonLinux2023(20)).toBe(false);
+ });
+ });
+
+ describe("downloadAndExtract and lambdafs", () => {
+ const extractDir = join(tmpdir(), "chromium-pack"); // downloadAndExtract extracts to /tmp
+
+ // Clean up known files before test (optional, for idempotency)
+ const expectedFiles = ["aws.tar.br", "chromium.br", "swiftshader.tar.br"];
+
+ const extractedFiles = [
+ "aws",
+ "chromium-pack",
+ "chromium",
+    "libEGL.so",

+ "libGLESv2.so",
+ "libvk_swiftshader.so",
+ "libvulkan.so.1",
+ "vk_swiftshader_icd.json",
+ ];
+
+ for (const file of extractedFiles) {
+ const filePath = join(extractDir, file);
+ if (existsSync(filePath)) {
+ try {
+ rmSync(filePath, { force: true, recursive: true });
+ } catch (error) {
+ // Ignore errors during cleanup
+ console.error("Cleanup error:", error);
+ }
+ }
+ }
+
+ it(
+ "should download and extract files successfully",
+ { timeout: 60 * 1000 },
+ async () => {
+ const url =
+ "https://github.com/Sparticuz/chromium/releases/download/v109.0.6/chromium-v109.0.6-pack.tar";
+
+ await downloadAndExtract(url);
+
+ // Check that expected files exist
+ for (const file of expectedFiles) {
+ const filePath = join(extractDir, file);
+ expect(existsSync(filePath)).toBe(true);
+ }
+ }
+ );
+
+ it("should extract a .tar file using lambdafs inflate and verify contents", async () => {
+ for (const file of expectedFiles) {
+ const filePath = join(extractDir, file);
+
+ await inflate(filePath);
+
+ // Check that the file was extracted successfully
+ if (filePath.includes("swiftshader")) {
+ expect(existsSync(join(tmpdir(), "libGLESv2.so"))).toBe(true);
+ } else if (filePath.includes("aws")) {
+ expect(existsSync(join(tmpdir(), "aws", "fonts.conf"))).toBe(true);
+ } else if (filePath.includes("chromium")) {
+ expect(existsSync(join(tmpdir(), "chromium"))).toBe(true);
+ }
+ }
+ });
+ });
+
+ afterAll(() => {
+ // Clean up the tmpdir
+ for (const file of [
+ "aws",
+ "chromium-pack",
+ "chromium",
+ "file.zip",
+ "libEGL.so",
+ "libGLESv2.so",
+ "libvk_swiftshader.so",
+ "libvulkan.so.1",
+ "vk_swiftshader_icd.json",
+ ]) {
+ rmSync(join(tmpdir(), file), { force: true, recursive: true });
+ }
+ });
+});
+
+describe("Paths", () => {
+ it("should return the correct bin path for modules", () => {
+ // This test isn't doing any testing
+ const binPath = getBinPath();
+ console.log("Bin Path", binPath);
+ expect(getBinPath()).toBe(binPath);
+ });
+});
+
+describe("Integration", () => {
+ let browser: puppeteer.Browser;
+
+ /**
+ * Setup FONTCONFIG_PATH for non-lambda environments
+ * This is needed for the fontconfig library to find the fonts
+ */
+ beforeAll(() => {
+ process.env["FONTCONFIG_PATH"] = join(tmpdir(), "fonts");
+ });
+
+ it("should open a Chromium window", async () => {
+ const args = puppeteer.defaultArgs({
+ args: chromium.args,
+ headless: "shell",
+ });
+ console.log("Args", args);
+ // Force the setup of Lambda environment
+ setupLambdaEnvironment(join(tmpdir(), "al2023", "lib"));
+ await inflate(join("bin", "al2023.tar.br"));
+ // Console log the contents of /tmp
+
+ browser = await puppeteer.launch({
+ args: args,
+ defaultViewport: {
+ deviceScaleFactor: 1,
+ hasTouch: false,
+ height: 1080,
+ isLandscape: true,
+ isMobile: false,
+ width: 1920,
+ },
+ executablePath: await chromium.executablePath("./bin/"),
+ headless: "shell",
+ });
+ try {
+ const tmpContents = execSync("/usr/bin/ls -la /tmp").toString();
+ console.log("Contents of /tmp directory:");
+ console.log(tmpContents);
+ } catch (error) {
+ console.error("Error listing /tmp directory:", error);
+ }
+ console.log("Browser", browser.connected, process.env["LD_LIBRARY_PATH"]);
+ const version = await browser.version();
+ expect(browser.connected).toBe(true);
+ expect(version).toContain("HeadlessChrome");
+ });
+
+ it("should open a new page", async () => {
+ const page = await browser.newPage();
+ await page.goto("https://example.com");
+ const title = await page.title();
+ expect(title).toBe("Example Domain");
+ });
+
+ it("should take a screenshot", async () => {
+ const page = await browser.newPage();
+ await page.goto("https://example.com", { waitUntil: "networkidle0" });
+ const screenshot = Buffer.from(await page.screenshot());
+ const base64Screenshot = `data:image/png;base64,${screenshot.toString(
+ "base64"
+ )}`;
+ // console.log(base64Screenshot);
+ const hash = createHash("sha256").update(base64Screenshot).digest("hex");
+ expect(hash).toBe(
+ "5b4042aa3f20574b0b408e4c22d65255004d7d2ac1f69e96021649570c74bb36"
+ );
+ });
+
+ it("should take a screenshot of get.webgl.org without the logo", async () => {
+ const page = await browser.newPage();
+ await page.goto("https://get.webgl.org", { waitUntil: "networkidle0" });
+ await page.evaluate(() => {
+ const el = document.querySelector("#logo-container");
+ if (el) el.remove();
+ });
+ const screenshot = Buffer.from(await page.screenshot());
+ const base64Screenshot = `data:image/png;base64,${screenshot.toString(
+ "base64"
+ )}`;
+ // console.log(base64Screenshot);
+ const hash = createHash("sha256").update(base64Screenshot).digest("hex");
+ expect(hash).toBe(
+ "1023e4f59fddb99d184847ca3711e79c06c04587aa7eacbf4ad6e97c7f52125d"
+ );
+ });
+
+ afterAll(async () => {
+ // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition
+ if (browser) {
+ console.log("Closing browser");
+ await browser.close();
+ }
+
+ // Clean up the tmpdir
+ for (const file of [
+ "al2023",
+ "chromium",
+ "fonts",
+ "libEGL.so",
+ "libGLESv2.so",
+ "libvk_swiftshader.so",
+ "libvulkan.so.1",
+ "vk_swiftshader_icd.json",
+ ]) {
+ rmSync(join(tmpdir(), file), { force: true, recursive: true });
+ }
+ });
+});
diff --git a/tools/build.sh b/tools/build.sh
new file mode 100755
index 0000000..1e92c21
--- /dev/null
+++ tools/build.sh
@@ -0,0 +1,54 @@
+#!/bin/bash
+# build.sh
+set -e # exit immediately if error
+
+echo "🧹 Cleaning build directory..."
+rm -rf build/**
+
+echo "📦 Building ESM..."
+cp package.json package.json.orig
+cp tsconfig.build.json tsconfig.build.json.orig
+jq '.exclude += ["./source/paths.cjs.ts"]' tsconfig.build.json > tsconfig.build.json.tmp && mv tsconfig.build.json.tmp tsconfig.build.json
+npx tsc -p tsconfig.build.json --outDir build/esm
+# Restore original package.json and tsconfig.build.json
+cp package.json.orig package.json
+cp tsconfig.build.json.orig tsconfig.build.json
+
+echo "📦 Building CommonJS..."
+# Update package.json and tsconfig for CommonJS build
+# Use jq to modify package.json and tsconfig.build.json
+# Ensure jq is installed: sudo apt install jq
+jq '.type = "commonjs"' package.json > package.json.tmp && mv package.json.tmp package.json
+# Replace ESM path import with CJS path import
+find source -name "*.ts" -exec sed -i 's/paths\.esm/paths.cjs/g' {} \;
+jq '.exclude += ["./source/paths.esm.ts"]' tsconfig.build.json > tsconfig.build.json.tmp && mv tsconfig.build.json.tmp tsconfig.build.json
+npx tsc -p tsconfig.build.json --outDir build/cjs
+# Rename .js files to .cjs in the CJS build
+find build/cjs -name "*.js" -exec sh -c 'mv "$1" "${1%.js}.cjs"' _ {} \;
+# Replace .js imports with .cjs imports in source files
+find build/cjs -name "*.cjs" -exec sed -i 's/\.js"/\.cjs"/g' {} \;
+# Replace "exports.default = Chromium;" with "module.exports = Chromium;" in CJS build files
+find build/cjs -name "*.cjs" -exec sed -i 's/exports\.default = \(.*\);/module.exports = \1;/g' {} \;
+
+# Restore original package.json and tsconfig.build.json
+mv package.json.orig package.json
+mv tsconfig.build.json.orig tsconfig.build.json
+# Undo the source change
+find source -name "*.ts" -exec sed -i 's/paths\.cjs/paths.esm/g' {} \;
+
+echo "📋 Creating package.json files for better module resolution..."
+# Create package.json for CJS directory
+cat > build/cjs/package.json << 'EOF'
+{
+ "type": "commonjs"
+}
+EOF
+
+# Create package.json for ESM directory
+cat > build/esm/package.json << 'EOF'
+{
+ "type": "module"
+}
+EOF
+
+echo '✅ Dual package build completed successfully!'
diff --git a/tools/download-open-sans.mjs b/tools/download-open-sans.mjs
new file mode 100644
index 0000000..dce4c4b
--- /dev/null
+++ tools/download-open-sans.mjs
@@ -0,0 +1,45 @@
+import { exec } from "node:child_process";
+import { mkdir, writeFile } from "node:fs/promises";
+import { join } from "node:path";
+import { promisify } from "node:util";
+
+const FONT_URL = [
+ "https://raw.githubusercontent.com/googlefonts/opensans/main/fonts/ttf/OpenSans-Bold.ttf",
+ "https://raw.githubusercontent.com/googlefonts/opensans/main/fonts/ttf/OpenSans-Italic.ttf",
+ "https://raw.githubusercontent.com/googlefonts/opensans/main/fonts/ttf/OpenSans-Regular.ttf",
+];
+const FONTS_DIR = join("fonts", "fonts", "Open_Sans");
+
+console.log(FONTS_DIR);
+
+async function downloadFonts() {
+ await mkdir(FONTS_DIR, { recursive: true });
+
+ for (const font of FONT_URL) {
+ // eslint-disable-next-line n/no-unsupported-features/node-builtins
+ const res = await fetch(font);
+ if (!res.ok) {
+ throw new Error(`Failed to download font: ${res.status}`);
+ }
+ const arrayBuffer = await res.arrayBuffer();
+ const fontFileName = font.split("/").pop();
+ const FONT_PATH = join(FONTS_DIR, fontFileName ?? "OpenSans.ttf");
+ await writeFile(FONT_PATH, Buffer.from(arrayBuffer));
+ }
+}
+
+await downloadFonts();
+
+console.log("Fonts downloaded successfully.");
+
+const execAsync = promisify(exec);
+
+const tarFile = join("bin", "fonts.tar");
+await execAsync(`tar -cf ${tarFile} -C fonts fonts.conf fonts`);
+
+console.log(`Fonts folder archived to ${tarFile}`);
+
+const brotliFile = `${tarFile}.br`;
+await execAsync(`brotli --best --force --rm --output=${brotliFile} ${tarFile}`);
+
+console.log(`Tar file compressed to ${brotliFile}`);
diff --git a/tools/update-browser-revision.mjs b/tools/update-browser-revision.mjs
new file mode 100644
index 0000000..dbc7b2b
--- /dev/null
+++ tools/update-browser-revision.mjs
@@ -0,0 +1,55 @@
+/**
+ * This file will update the chromium revision in inventory.ini
+ * based on the current stable version of chromium
+ */
+
+import { readFile, writeFile } from "node:fs/promises";
+import { dirname, resolve } from "node:path";
+import { fileURLToPath } from "node:url";
+
+const __filename = fileURLToPath(import.meta.url);
+const __dirname = dirname(__filename);
+
+async function updateDevToolsProtocolVersion() {
+ // eslint-disable-next-line n/no-unsupported-features/node-builtins
+ const result = await fetch(
+ "https://googlechromelabs.github.io/chrome-for-testing/last-known-good-versions.json"
+ );
+ const { channels } = await result.json();
+
+ return channels["Stable"];
+}
+
+/**
+ * Updates the chromium_revision in the specified inventory file.
+ * @param {string} filePath - The path to the inventory.ini file.
+ * @param {string} newRevision - The new revision number.
+ */
+async function updateInventoryFile(filePath, newRevision) {
+ try {
+ const data = await readFile(filePath, "utf8");
+ const updatedData = data.replace(
+ /^(chromium_revision=).*$/m,
+ `$1${newRevision}`
+ );
+ await writeFile(filePath, updatedData, "utf8");
+ console.log(
+ `Successfully updated ${filePath} with revision ${newRevision}`
+ );
+ } catch (error) {
+ console.error(`Error updating inventory file: ${error}`);
+ }
+}
+
+/**
+ * The Command block
+ */
+const stableChannelInfo = await updateDevToolsProtocolVersion();
+const { revision, version } = stableChannelInfo;
+
+console.log(
+ `Fetched stable Chromium revision for Chromium ${version}: ${revision}`
+);
+
+const inventoryPath = resolve(__dirname, "../_/ansible/inventory.ini");
+await updateInventoryFile(inventoryPath, revision);
diff --git a/tsconfig.build.json b/tsconfig.build.json
new file mode 100644
index 0000000..02dd744
--- /dev/null
+++ tsconfig.build.json
@@ -0,0 +1,8 @@
+{
+ "extends": "./tsconfig.json",
+ "compilerOptions": {
+ "declaration": true,
+ "rootDir": "./source"
+ },
+ "include": ["source"]
+}
diff --git tsconfig.json tsconfig.json
index 3fd39c0..d79eb9d 100644
--- tsconfig.json
+++ tsconfig.json
@@ -1,10 +1,19 @@
{
- "extends": ["@tsconfig/node16/tsconfig", "@tsconfig/strictest"],
+ "extends": ["@tsconfig/node20/tsconfig", "@tsconfig/strictest"],
"compilerOptions": {
"declaration": true,
- "lib": ["dom", "ES2021"],
- "module": "commonjs",
- "outDir": "build"
+ "lib": ["dom", "ES2023"],
+ "module": "NodeNext",
+ "moduleResolution": "NodeNext",
+ "outDir": "build",
+ "rootDir": "."
},
- "include": ["source"]
+ "include": [
+ "source",
+ "tests",
+ "vitest.config.ts",
+ "eslint.config.js",
+ "tools",
+ "_/amazon/handlers"
+ ]
}
diff --git a/vitest.config.ts b/vitest.config.ts
new file mode 100644
index 0000000..d2c04c7
--- /dev/null
+++ vitest.config.ts
@@ -0,0 +1,19 @@
+import { defineConfig } from "vitest/config";
+
+export default defineConfig({
+ test: {
+ coverage: {
+ exclude: [
+ "update-browser-revision.mjs",
+ "build",
+ "docker",
+ "examples",
+ "vitest.config.ts",
+ "_",
+ "eslint.config.js",
+ ],
+ reporter: ["json", "json-summary", "text"],
+ reportOnFailure: true,
+ },
+ },
+});
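The `update-browser-revision` script in the diff above pivots on a single multiline regex. A dependency-free sketch of that substitution (the sample inventory contents and both revision numbers are made up for illustration):

```javascript
// Dependency-free sketch of the substitution performed by updateInventoryFile
// in the diff above. The sample inventory contents and both revision numbers
// are invented for illustration.
const sample = [
  "[all:vars]",
  "chromium_revision=1402768",
  "other_setting=unchanged",
].join("\n");

const newRevision = "1465706"; // hypothetical new revision
// The /m flag makes ^ and $ match per line, so only the chromium_revision
// line is rewritten; $1 re-emits the captured "chromium_revision=" prefix.
const updated = sample.replace(/^(chromium_revision=).*$/m, `$1${newRevision}`);

console.log(updated.split("\n")[1]); // "chromium_revision=1465706"
```

Only the matched line changes; the surrounding keys pass through untouched.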
Description

This is a major release PR for `@sparticuz/chromium`.

Possible Issues
Security Hotspots
Changes

Architecture Support (ARM64)
Build System
CI/CD
Source Code
Tests & Tooling
Documentation & Examples
```mermaid
sequenceDiagram
    participant User
    participant Chromium as @sparticuz/chromium
    participant Helper as helper.ts
    participant LambdaFS as lambdafs.ts
    participant FS as File System
    participant Net as Network
    User->>Chromium: import chromium
    Chromium->>Helper: isRunningInAmazonLinux2023(nodeMajorVersion)
    Helper-->>Chromium: true/false
    alt Running in AL2023
        Chromium->>Helper: setupLambdaEnvironment(baseLibPath)
        Helper->>Helper: Set FONTCONFIG_PATH, HOME, LD_LIBRARY_PATH
    end
    User->>Chromium: chromium.executablePath(input?)
    Chromium->>FS: existsSync(/tmp/chromium)
    alt Already extracted
        Chromium-->>User: /tmp/chromium
    else Not extracted
        alt Input is URL
            Chromium->>Helper: downloadAndExtract(url)
            Helper->>Net: HTTPS GET (follow-redirects)
            Net-->>Helper: tar stream
            Helper->>FS: extract to /tmp/chromium-pack
            Helper-->>Chromium: /tmp/chromium-pack
        end
        Chromium->>LambdaFS: inflate(chromium.br)
        Chromium->>LambdaFS: inflate(fonts.tar.br)
        Chromium->>LambdaFS: inflate(swiftshader.tar.br)
        opt AL2023 environment
            Chromium->>LambdaFS: inflate(al2023.tar.br)
        end
        LambdaFS->>FS: createReadStream → brotliDecompress → write/extract
        LambdaFS-->>Chromium: /tmp/chromium
        Chromium-->>User: /tmp/chromium
    end
    User->>User: puppeteer.launch({executablePath, args: chromium.args})
```
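The flow in the diagram above boils down to a three-way decision inside `executablePath`. A hypothetical, dependency-free condensation (`resolveChromiumSource` is not part of the package API; it only illustrates the branches):

```javascript
// Hypothetical condensation of the executablePath decision flow shown in the
// sequence diagram. resolveChromiumSource is NOT a real package export.
function resolveChromiumSource(input, alreadyExtracted) {
  if (alreadyExtracted) {
    // Warm invocation: /tmp/chromium survives between Lambda calls.
    return "reuse-/tmp/chromium";
  }
  if (typeof input === "string" && /^https?:\/\//.test(input)) {
    // chromium-min style: fetch the pack tar from a remote endpoint first.
    return "download-then-inflate";
  }
  // Full package: inflate the bundled brotli archives
  // (chromium.br, fonts.tar.br, swiftshader.tar.br, ...).
  return "inflate-bundled";
}

console.log(resolveChromiumSource(undefined, true));                          // "reuse-/tmp/chromium"
console.log(resolveChromiumSource("https://example.com/pack.x64.tar", false)); // "download-then-inflate"
console.log(resolveChromiumSource(undefined, false));                          // "inflate-bundled"
```

The real implementation also performs the extraction side effects; this sketch only names which branch is taken.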
This PR contains the following updates:
| Package | Change |
| --- | --- |
| @sparticuz/chromium | `^117.0.0` → `^143.0.0` |

Release Notes
Sparticuz/chromium (@sparticuz/chromium)
v143.0.4 (Compare Source)

@sparticuz/chromium v143.0.4, @sparticuz/chromium-min v143.0.4

The `chromium-v143.0.4-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v143.0.4-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

Support this project's continued development by becoming a monthly sponsor on GitHub. Your contribution helps cover monthly maintenance costs and ensures ongoing improvements.
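The code block referenced by "using the following code" was stripped from this page. A layer zip of this kind is typically published with the AWS CLI, roughly as below; the layer name, runtime list, and architecture are placeholder choices, not values mandated by the release notes:

```shell
# Hedged sketch: publish the downloaded zip as a Lambda layer.
# Layer name, runtime, and architecture below are placeholders.
aws lambda publish-layer-version \
  --layer-name chromium \
  --zip-file "fileb://chromium-v143.0.4-layer.x64.zip" \
  --compatible-runtimes nodejs20.x \
  --compatible-architectures x86_64
```

Running this requires AWS credentials with `lambda:PublishLayerVersion` permission.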
Full Changelog: Sparticuz/chromium@v143.0.0...v143.0.4
v143.0.3 (Compare Source)

v143.0.2 (Compare Source)
v143.0.0 (Compare Source)

@sparticuz/chromium v143.0.0, @sparticuz/chromium-min v143.0.0

The `chromium-v143.0.0-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v143.0.0-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.
What's Changed
New Contributors
Full Changelog: Sparticuz/chromium@v141.0.0...v143.0.0
v141.0.0 (Compare Source)

@sparticuz/chromium v141.0.0, @sparticuz/chromium-min v141.0.0

The `chromium-v141.0.0-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v141.0.0-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.
What's Changed
New Contributors
Full Changelog: Sparticuz/chromium@v140.0.0...v141.0.0
v140.0.0 (Compare Source)

@sparticuz/chromium v140.0.0, @sparticuz/chromium-min v140.0.0

The `chromium-v140.0.0-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v140.0.0-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.
What's Changed
Full Changelog: Sparticuz/chromium@v138.0.2...v140.0.0
v138.0.2 (Compare Source)

@sparticuz/chromium v138.0.2, @sparticuz/chromium-min v138.0.2

The `chromium-v138.0.2-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v138.0.2-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.
Full Changelog: Sparticuz/chromium@v138.0.1...v138.0.2
v138.0.1 (Compare Source)

@sparticuz/chromium v138.0.1, @sparticuz/chromium-min v138.0.1

The `chromium-v138.0.1-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v138.0.1-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.
Full Changelog: Sparticuz/chromium@v138.0.0...v138.0.1
v138.0.0 (Compare Source)

@sparticuz/chromium v138.0.0, @sparticuz/chromium-min v138.0.0

The `chromium-v138.0.0-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v138.0.0-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v137.0.1...v138.0.0
v137.0.1 (Compare Source)

@sparticuz/chromium v137.0.1, @sparticuz/chromium-min v137.0.1

The `chromium-v137.0.1-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v137.0.1-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.
Full Changelog: Sparticuz/chromium@v137.0.0...v137.0.1
v137.0.0 (Compare Source)

@sparticuz/chromium v137.0.0, @sparticuz/chromium-min v137.0.0

The `chromium-v137.0.0-layer.ARCH.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v137.0.0-pack.ARCH.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

Notable Changes
A lot has changed since the last public release of @sparticuz/chromium 133.
Breaking Changes
Removed Args
Removed Opinionated Viewport
The User must now specify the viewport for Puppeteer. The following is a good default.
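The default viewport snippet itself was stripped from this page; the following is a plausible reconstruction, where every value is an assumption rather than a library-mandated setting:

```javascript
// Hedged reconstruction of a reasonable defaultViewport to pass to
// puppeteer.launch(); the actual snippet was stripped from this page
// and these exact values are assumptions.
const defaultViewport = {
  width: 1920,
  height: 1080,
  deviceScaleFactor: 1,
};

console.log(defaultViewport.width); // 1920
```

A caller would pass this as the `defaultViewport` launch option alongside `executablePath` and `args`.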
Removed chromium.headless
The User must now specify the headless type for their library.
- `"shell"` for Puppeteer
- `true` for Playwright

arm64 Support
Finally! It's here! A layer and a pack are now built for arm64.
Dependencies
Dependencies (fonts and lib packs) are now built instead of manually curated files.
Summary of the PRs that make this release
Chromium 137 (#359)
- `x64` and `arm64` builds, making artifact naming and handling explicit for both architectures (`archs` now supports both `x64` and `arm64`).

Make an arm64 layer also (#353)
- `release.yml`, `test-arm.yml`, and `test-x64.yml` updated to handle separate ARM64 and x64 artifacts.
- `bin/x64` and `bin/arm64` directories for clarity and separation.
- `package.json` updated to exclude unnecessary arch-specific files from wrong releases.

Major refactor, arm64 support, auto-gen dependencies (#351)
- `'shell'` for Puppeteer and defaults to `true` for Playwright.
- Dependency packs (`fonts.tar.br`, `al2023.tar.br`) are now auto-generated.
What's Changed
`error` event for remote font by @Juneezee in #350

New Contributors
Full Changelog: Sparticuz/chromium@v133.0.0...v137.0.0
v133.0.0 (Compare Source)

@sparticuz/chromium v133.0.0, @sparticuz/chromium-min v133.0.0

The `chromium-v133.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v133.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
`headless` property typing by @hylickipiotr in #338

New Contributors
Full Changelog: Sparticuz/chromium@v132.0.0...v133.0.0
v132.0.0 (Compare Source)

@sparticuz/chromium v132.0.0, @sparticuz/chromium-min v132.0.0

The `chromium-v132.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v132.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
`chrome-headless-shell` does not and will not support the "new" headless mode.

Full Changelog: Sparticuz/chromium@v131.0.1...v132.0.0
v131.0.1 (Compare Source)

@sparticuz/chromium v131.0.1, @sparticuz/chromium-min v131.0.1

The `chromium-v131.0.1-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v131.0.1-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v131.0.0...v131.0.1
v131.0.0 (Compare Source)

@sparticuz/chromium v131.0.0, @sparticuz/chromium-min v131.0.0

The `chromium-v131.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v131.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v130.0.0...v131.0.0
v130.0.0 (Compare Source)

@sparticuz/chromium v130.0.0, @sparticuz/chromium-min v130.0.0

The `chromium-v130.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v130.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v129.0.0...v130.0.0
v129.0.0 (Compare Source)

@sparticuz/chromium v129.0.0, @sparticuz/chromium-min v129.0.0

The `chromium-v129.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v129.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v127.0.0...v129.0.0
v127.0.0 (Compare Source)

@sparticuz/chromium v127.0.0, @sparticuz/chromium-min v127.0.0

The `chromium-v127.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v127.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v126.0.0...v127.0.0
v126.0.0 (Compare Source)

@sparticuz/chromium v126.0.0, @sparticuz/chromium-min v126.0.0

The `chromium-v126.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v126.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v123.0.1...v126.0.0
v123.0.1 (Compare Source)

@sparticuz/chromium v123.0.1, @sparticuz/chromium-min v123.0.1

The `chromium-v123.0.1-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v123.0.1-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v123.0.0...v123.0.1
v123.0.0 (Compare Source)

@sparticuz/chromium v123.0.0, @sparticuz/chromium-min v123.0.0

The `chromium-v123.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v123.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v122.0.0...v123.0.0
v122.0.0 (Compare Source)

@sparticuz/chromium v122.0.0, @sparticuz/chromium-min v122.0.0

The `chromium-v122.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v122.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

Breaking Changes
`headless` field values should now be set to `chrome-headless-shell` for legacy use (which is what @sparticuz/chromium is using). `true` is reserved for future use.

What's Changed
Full Changelog: Sparticuz/chromium@v121.0.0...v122.0.0
v121.0.0 (Compare Source)

@sparticuz/chromium v121.0.0, @sparticuz/chromium-min v121.0.0

The `chromium-v121.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v121.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
New Contributors
Full Changelog: Sparticuz/chromium@v119.0.2...v121.0.0
v119.0.2 (Compare Source)

@sparticuz/chromium v119.0.2, @sparticuz/chromium-min v119.0.2

The `chromium-v119.0.2-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v119.0.2-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Special Thanks
Full Changelog: Sparticuz/chromium@v119.0.0...v119.0.2
v119.0.0 (Compare Source)

@sparticuz/chromium v119.0.0, @sparticuz/chromium-min v119.0.0

The `chromium-v119.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v119.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
`LayerVersion` example by @davidjb in #178

New Contributors
Full Changelog: Sparticuz/chromium@v118.0.0...v119.0.0
v118.0.0 (Compare Source)

@sparticuz/chromium v118.0.0, @sparticuz/chromium-min v118.0.0

The `chromium-v118.0.0-layer.zip` file may be uploaded directly as a layer in AWS Lambda using the following code.

The `chromium-v118.0.0-pack.tar` file may be uploaded to any https endpoint and the remote location may be used as the `input` variable in the `chromium.executablePath(input)` function.

What's Changed
Full Changelog: Sparticuz/chromium@v117.0.0...v118.0.0
Configuration
📅 Schedule: (UTC)
🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.