feat(datasource/deb): Support deb indices compression #35865

Open · wants to merge 13 commits into base: main
Changes from 9 commits
1 change: 1 addition & 0 deletions lib/modules/datasource/deb/__fixtures__/InRelease2
@@ -0,0 +1 @@
dc79555ac96e9efa6b17ef2c3d382b0ec25755706798a0cf3e763e49dadceb53
557 changes: 557 additions & 0 deletions lib/modules/datasource/deb/__fixtures__/InReleaseBookworm

Large diffs are not rendered by default.

17 changes: 17 additions & 0 deletions lib/modules/datasource/deb/__fixtures__/InReleaseInvalid
@@ -0,0 +1,17 @@
Origin: Debian
Label: Debian
Suite: stable
Version: 12.10
Codename: bookworm
Changelogs: https://metadata.ftp-master.debian.org/changelogs/@CHANGEPATH@_changelog
Date: Sat, 15 Mar 2025 09:09:36 UTC
Acquire-By-Hash: yes
No-Support-for-Architecture-all: Packages
Architectures: all amd64 arm64 armel armhf i386 mips64el mipsel ppc64el s390x
Components: main contrib non-free-firmware non-free
Description: Debian 12.10 Released 15 March 2025
MD5Sum:
SHA256:
d0f253340d20cf69d4781b80088b6c2b00b0002e69ca0a50c9197c634bd1fcef 66277 non-free/binary-s390x/Packages.gz non-free/binary-test/Packages.gz
0adc3569f322f7c993a39f471783aba9f91792789e57774ed2a28b3ecbbe0e0c 54196 non-free/binary-s390x/Packages.xz non-free/binary-test/Packages.xz
d357305aec89074729f9c85a7f0c44061c75ed8db3db5e0e73f24a6da27e0879 121 non-free/binary-s390x/Release non-free/binary-test/Packages
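
For context, each checksum entry in a Release/InRelease file normally has the form '<hash> <size> <path>'; the entries above carry an extra trailing path, which appears to be what makes this fixture invalid. A rough parsing sketch (a hypothetical helper, not the datasource's actual parser) that would reject such lines:

// Hypothetical sketch: split one Release checksum line into its fields.
// A well-formed line is exactly "<sha256> <size> <path>"; anything else is rejected.
function parseChecksumLine(
  line: string,
): { hash: string; size: number; path: string } | null {
  const parts = line.trim().split(/\s+/);
  if (parts.length !== 3) {
    return null; // e.g. the duplicated path columns in the fixture above
  }
  const [hash, size, path] = parts;
  return { hash, size: Number(size), path };
}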
Binary file not shown.
Binary file not shown.
1,558 changes: 1,558 additions & 0 deletions lib/modules/datasource/deb/__fixtures__/Release

Large diffs are not rendered by default.

67 changes: 60 additions & 7 deletions lib/modules/datasource/deb/file.spec.ts
@@ -1,3 +1,4 @@
import { copyFile } from 'fs';
import type { DirectoryResult } from 'tmp-promise';
import { dir } from 'tmp-promise';
import upath from 'upath';
@@ -6,18 +7,24 @@ import { extract } from './file';
import { Fixtures } from '~test/fixtures';
import { fs } from '~test/util';

const fixturePackagesArchivePath = Fixtures.getPath(`Packages.gz`);
const fixturePackagesArchiveGzPath = Fixtures.getPath(`Packages.gz`);
const fixturePackagesArchiveBz2Path = Fixtures.getPath(`Packages.bz2`);
const fixturePackagesArchiveXzPath = Fixtures.getPath(`Packages.xz`);

describe('modules/datasource/deb/file', () => {
let cacheDir: DirectoryResult | null;
let extractedPackageFile: string;
let extractionFolder: string;
let packageArchiveCache: string;

beforeEach(async () => {
cacheDir = await dir({ unsafeCleanup: true });
GlobalConfig.set({ cacheDir: cacheDir.path });

const extractionFolder = await fs.ensureCacheDir('file');
extractedPackageFile = upath.join(extractionFolder, `package.txt`);
extractionFolder = await fs.ensureCacheDir('file');
extractedPackageFile = upath.join(extractionFolder, 'package.txt');

packageArchiveCache = upath.join(extractionFolder, 'Package');
});

afterEach(async () => {
@@ -26,10 +33,56 @@ describe('modules/datasource/deb/file', () => {
});

describe('extract', () => {
it('should throw error for unsupported compression', async () => {
await expect(
extract(fixturePackagesArchivePath, 'xz', extractedPackageFile),
).rejects.toThrow('Unsupported compression standard');
it('should support xz compression', async () => {
await copyFixtureToCache(
fixturePackagesArchiveXzPath,
packageArchiveCache,
);
await extract(packageArchiveCache, 'xz', extractedPackageFile);
const fileContent = await fs.readCacheFile(extractedPackageFile, 'utf8');
expect(fileContent).toContain('Package:');
});

it('should support gz compression', async () => {
await copyFixtureToCache(
fixturePackagesArchiveGzPath,
packageArchiveCache,
);
await extract(packageArchiveCache, 'gz', extractedPackageFile);
const fileContent = await fs.readCacheFile(extractedPackageFile, 'utf8');
expect(fileContent).toContain('Package:');
});

it('should support bz2 compression', async () => {
await copyFixtureToCache(
fixturePackagesArchiveBz2Path,
packageArchiveCache,
);
await extract(packageArchiveCache, 'bz2', extractedPackageFile);
const fileContent = await fs.readCacheFile(extractedPackageFile, 'utf8');
expect(fileContent).toContain('Package:');
});
});
});

/**
 * Copies a fixture file to the cache directory.
 *
 * @param fixturePath - Path of the fixture file to copy.
 * @param cachePath - Destination path inside the cache directory.
 * @returns A promise that resolves once the file has been copied.
 */
function copyFixtureToCache(
fixturePath: string,
cachePath: string,
): Promise<void> {
return new Promise((resolve, reject) => {
copyFile(fixturePath, cachePath, (err) => {
if (err) {
reject(err);
} else {
resolve();
}
});
});
}
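
As a side note, the same helper could likely be collapsed onto Node's promise-based fs API (a sketch only; it assumes importing from 'fs/promises' is acceptable in this test file):

import { copyFile } from 'fs/promises';

// Sketch: promise-based copy, replacing the manual Promise wrapper above.
function copyFixtureToCache(
  fixturePath: string,
  cachePath: string,
): Promise<void> {
  return copyFile(fixturePath, cachePath);
}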
25 changes: 18 additions & 7 deletions lib/modules/datasource/deb/file.ts
@@ -1,11 +1,13 @@
import { createUnzip } from 'zlib';
import * as lzma from 'lzma-native';
import unbzip2 from 'unbzip2-stream';
Comment on lines +2 to +3

Member: Are they using any binary node modules we need to take care of for different architectures? We build images for amd64 and arm64.

Author: According to their documentation, they do not require additional binary node modules. While unbzip2-stream is fully in JavaScript, lzma-native "provides pre-built binaries for multiple Node.js versions and all major OS using node-pre-gyp".
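
A minimal round-trip check (not part of this PR, shown only as a sketch) that could be run inside the amd64/arm64 images to confirm the prebuilt lzma-native binding loads:

import { Readable } from 'node:stream';
import { buffer } from 'node:stream/consumers';
import * as lzma from 'lzma-native';

// Sketch: compress and decompress a small buffer through lzma-native's
// native binding; if the prebuilt binary fails to load, this throws on
// import or during the pipelines below.
async function lzmaSmokeTest(): Promise<void> {
  const input = Buffer.from('Package: example\n');
  const compressed = await buffer(
    Readable.from([input]).pipe(lzma.createCompressor()),
  );
  const roundTripped = await buffer(
    Readable.from([compressed]).pipe(lzma.createDecompressor()),
  );
  if (!roundTripped.equals(input)) {
    throw new Error('lzma-native round-trip failed');
  }
}

void lzmaSmokeTest();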

import * as fs from '../../../util/fs';

/**
* Extracts the specified compressed file to the output file.
*
* @param compressedFile - The path to the compressed file.
* @param compression - The compression method used (currently only 'gz' is supported).
* @param compression - The compression method used (currently 'gz', 'xz' and 'bz2' are supported).
* @param outputFile - The path where the extracted content will be stored.
* @throws Will throw an error if the compression method is unknown.
*/
@@ -14,12 +16,21 @@ export async function extract(
compression: string,
outputFile: string,
): Promise<void> {
if (compression === 'gz') {
const source = fs.createCacheReadStream(compressedFile);
const destination = fs.createCacheWriteStream(outputFile);
await fs.pipeline(source, createUnzip(), destination);
} else {
throw new Error(`Unsupported compression standard '${compression}'`);
const source = fs.createCacheReadStream(compressedFile);
const destination = fs.createCacheWriteStream(outputFile);

switch (compression) {
case 'gz':
await fs.pipeline(source, createUnzip(), destination);
break;
case 'xz':
await fs.pipeline(source, lzma.createDecompressor(), destination);
break;
case 'bz2':
await fs.pipeline(source, unbzip2(), destination);
break;
default:
throw new Error('Unsupported compression standard');
}
}
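
For illustration only, a hypothetical call site for the updated function (file names and paths are made up; in practice they must point inside Renovate's cache directory):

// Hypothetical usage: decompress downloaded Packages indices; the compression
// suffix would come from the selected Release file entry.
await extract('/tmp/renovate/cache/deb/Packages.xz', 'xz', '/tmp/renovate/cache/deb/Packages');
await extract('/tmp/renovate/cache/deb/Packages.bz2', 'bz2', '/tmp/renovate/cache/deb/Packages');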
