
Commit

Merge pull request #64 from nika-begiashvili/v2
Version 2.0.0
nika-begiashvili authored Jan 18, 2024
2 parents ec7a601 + 6097242 commit 7efaeb1
Showing 64 changed files with 17,339 additions and 5,806 deletions.
33 changes: 16 additions & 17 deletions .eslintrc.js
@@ -1,18 +1,17 @@
module.exports = {
"env": {
"browser": true,
"es6": true,
"node": true
},
"extends": "eslint:recommended",
"globals": {
"Atomics": "readonly",
"SharedArrayBuffer": "readonly"
},
"parserOptions": {
"ecmaVersion": 2018,
"sourceType": "module"
},
"rules": {
}
};
env: {
browser: true,
es6: true,
node: true,
},
extends: "eslint:recommended",
globals: {
Atomics: "readonly",
SharedArrayBuffer: "readonly",
},
parserOptions: {
ecmaVersion: 2018,
sourceType: "module",
},
rules: {},
};
3 changes: 3 additions & 0 deletions .prettierignore
@@ -0,0 +1,3 @@
lib
dist
*.md
3 changes: 2 additions & 1 deletion .travis.yml
@@ -1,3 +1,4 @@
dist: jammy
language: node_js
node_js:
- "node"
- "lts/*"
48 changes: 46 additions & 2 deletions README.md
@@ -21,13 +21,20 @@

## Overview

Libarchivejs is an archive tool for the browser that can extract various types of compressed archives. It's a port of [libarchive](https://github.com/libarchive/libarchive) to WebAssembly with a JavaScript wrapper to make it easier to use. Since it runs on WebAssembly, performance should be near native. Supported formats: **ZIP**, **7-Zip**, **RAR v4**, **RAR v5**, **TAR**. Supported compression: **GZIP**, **DEFLATE**, **BZIP2**, **LZMA**
Libarchivejs is an archive tool for the browser that can extract and create various types of compressed archives. It's a port of [libarchive](https://github.com/libarchive/libarchive) to WebAssembly with a JavaScript wrapper to make it easier to use. Since it runs on WebAssembly, performance should be near native. Supported formats: **ZIP**, **7-Zip**, **RAR v4**, **RAR v5**, **TAR**, etc. Supported compression: **GZIP**, **DEFLATE**, **BZIP2**, **LZMA**, etc.

## Version 2.0 highlights!

* <font size="5">Create archives</font>
* <font size="5">Use it in NodeJS</font>

## How to use

Install with `npm i libarchive.js` and use it as an ES module.

The library consists of two parts: an ES module and a web worker bundle. The ES module is your interface to the library; use it like any other module. The web worker bundle lives in the `libarchive.js/dist` folder, so make sure it is available in your public folder, since it will not be picked up by your bundler (it is already bundled), and pass the correct path to the `Archive.init()` method.

*If the libarchive.js file is in the same directory as the bundle file, then you don't need to call `Archive.init()` at all.*

```js
import {Archive} from 'libarchive.js/main.js';
@@ -117,6 +124,43 @@ To extract a single file from the archive you can use the `extract()` method on
let obj = await archive.extractFiles();
```
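
For extraction in the browser, a minimal sketch built on the API declared in this release could look like the following; the worker path and the `file-input` element are assumptions about your setup:

```js
import { Archive } from "libarchive.js/main.js";

// Assumed path: point workerUrl at wherever you copied libarchive.js/dist
// into your public assets.
Archive.init({
  workerUrl: "libarchive.js/dist/worker-bundle.js",
});

document.getElementById("file-input").addEventListener("change", async (e) => {
  const archive = await Archive.open(e.target.files[0]);
  const filesObject = await archive.getFilesObject();
  console.log(filesObject); // nested folders with CompressedFile entries
  // Each CompressedFile is extracted lazily, e.g.:
  // const realFile = await filesObject["folder"]["file.txt"].extract();
});
```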

### Create new archive

**Note:** `pathname` is optional in the browser but **required** in NodeJS.

```js
const archiveFile = await Archive.write({
  files: [
    { file: file, pathname: 'folder/file.zip' }
  ],
  outputFileName: "test.tar.gz",
  compression: ArchiveCompression.GZIP,
  format: ArchiveFormat.USTAR,
  passphrase: null,
});
```
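
The `File` returned by `Archive.write()` is a regular browser `File`, so one possible way to hand it to the user is a standard object-URL download:

```js
// Illustrative only: turn the generated archive into a download link.
const url = URL.createObjectURL(archiveFile);
const link = document.createElement("a");
link.href = url;
link.download = archiveFile.name; // "test.tar.gz"
link.click();
URL.revokeObjectURL(url);
```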

### Use it in NodeJS

```js
import fs from "fs";
import { Archive, ArchiveCompression, ArchiveFormat } from "libarchive.js/dist/libarchive-node.mjs";

let buffer = fs.readFileSync("test/files/archives/README.md");
let blob = new Blob([buffer]);

const archiveFile = await Archive.write({
  files: [
    {
      file: blob,
      pathname: "README.md",
    },
  ],
  outputFileName: "test.tar.gz",
  compression: ArchiveCompression.GZIP,
  format: ArchiveFormat.USTAR,
  passphrase: null,
});
```
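
To persist the result, the returned `File` can be converted back to a `Buffer` and written to disk. This is a sketch assuming a Node version where `Blob`/`File` expose `arrayBuffer()` (Node 18+):

```js
import fs from "fs";

// Continues the example above: archiveFile is the File returned by Archive.write().
const outBuffer = Buffer.from(await archiveFile.arrayBuffer());
fs.writeFileSync("test.tar.gz", outBuffer);
```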

## How it works

Libarchivejs is a port of the popular [libarchive](https://github.com/libarchive/libarchive) C library to WASM. Since WASM runs in the current thread, the library uses web workers for the heavy lifting. The ES module (the `Archive` class) is just a client for the web worker; it's tiny and doesn't take up much space.
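
The `ArchiveOptions` type added in this release (`workerUrl`, `getWorker`, `createClient`) reflects that split: the `Archive` class only talks to a worker you either let it create or hand to it. A hedged sketch of overriding where the worker comes from (the paths shown are assumptions):

```js
import { Archive } from "libarchive.js/main.js";

Archive.init({
  // Either point at a self-hosted copy of the worker bundle...
  workerUrl: "/assets/worker-bundle.js",
  // ...or construct the worker yourself:
  // getWorker: () => new Worker("/assets/worker-bundle.js", { type: "module" }),
});
```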
51 changes: 51 additions & 0 deletions dist/build/compiled/archive-reader.d.ts
@@ -0,0 +1,51 @@
export type ArchiveEntry = {
size: number;
path: string;
type: string;
lastModified: number;
fileData: ArrayBuffer;
fileName: string;
};
export declare class ArchiveReader {
private file;
private client;
private worker;
private _content;
private _processed;
constructor(file: File, client: any, worker: any);
/**
* Prepares file for reading
* @returns {Promise<ArchiveReader>} archive reader instance
*/
open(): Promise<ArchiveReader>;
/**
* Terminate worker to free up memory
*/
close(): Promise<void>;
/**
* detect if archive has encrypted data
* @returns {boolean|null} null if could not be determined
*/
hasEncryptedData(): Promise<boolean | null>;
/**
* set password to be used when reading archive
*/
usePassword(archivePassword: string): Promise<void>;
/**
* Set locale, defaults to en_US.UTF-8
*/
setLocale(locale: string): Promise<void>;
/**
* Returns object containing directory structure and file information
* @returns {Promise<object>}
*/
getFilesObject(): Promise<any>;
getFilesArray(): Promise<any[]>;
extractSingleFile(target: string): Promise<File>;
/**
* Returns object containing directory structure and extracted File objects
* @param {Function} extractCallback
*
*/
extractFiles(extractCallback?: Function | undefined): Promise<any>;
}
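
The declarations above also describe the encryption flow (`hasEncryptedData`, `usePassword`); a hedged usage sketch, with the `rarFile` variable and passphrase purely illustrative:

```js
const archive = await Archive.open(rarFile);
if (await archive.hasEncryptedData()) {
  await archive.usePassword("correct horse battery staple");
}
const files = await archive.extractFiles();
await archive.close(); // release the worker and its memory
```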
26 changes: 26 additions & 0 deletions dist/build/compiled/compressed-file.d.ts
@@ -0,0 +1,26 @@
import { ArchiveReader } from "./archive-reader";
/**
* Represents compressed file before extraction
*/
export declare class CompressedFile {
constructor(name: string, size: number, path: string, lastModified: number, archiveRef: ArchiveReader);
private _name;
private _size;
private _path;
private _lastModified;
private _archiveRef;
/**
* File name
*/
get name(): string;
/**
* File size
*/
get size(): number;
get lastModified(): number;
/**
* Extract file from archive
* @returns {Promise<File>} extracted file
*/
extract(): any;
}
49 changes: 49 additions & 0 deletions dist/build/compiled/formats.d.ts
@@ -0,0 +1,49 @@
export declare enum ArchiveFormat {
SEVEN_ZIP = "7zip",
AR = "ar",
ARBSD = "arbsd",
ARGNU = "argnu",
ARSVR4 = "arsvr4",
BIN = "bin",
BSDTAR = "bsdtar",
CD9660 = "cd9660",
CPIO = "cpio",
GNUTAR = "gnutar",
ISO = "iso",
ISO9660 = "iso9660",
MTREE = "mtree",
MTREE_CLASSIC = "mtree-classic",
NEWC = "newc",
ODC = "odc",
OLDTAR = "oldtar",
PAX = "pax",
PAXR = "paxr",
POSIX = "posix",
PWB = "pwb",
RAW = "raw",
RPAX = "rpax",
SHAR = "shar",
SHARDUMP = "shardump",
USTAR = "ustar",
V7TAR = "v7tar",
V7 = "v7",
WARC = "warc",
XAR = "xar",
ZIP = "zip"
}
export declare enum ArchiveCompression {
B64ENCODE = "b64encode",
BZIP2 = "bzip2",
COMPRESS = "compress",
GRZIP = "grzip",
GZIP = "gzip",
LRZIP = "lrzip",
LZ4 = "lz4",
LZIP = "lzip",
LZMA = "lzma",
LZOP = "lzop",
UUENCODE = "uuencode",
XZ = "xz",
ZSTD = "zstd",
NONE = "none"
}
1 change: 1 addition & 0 deletions dist/build/compiled/libarchive-browser.d.ts
@@ -0,0 +1 @@
export * from "./libarchive";
1 change: 1 addition & 0 deletions dist/build/compiled/libarchive-node.d.ts
@@ -0,0 +1 @@
export * from "./libarchive";
31 changes: 31 additions & 0 deletions dist/build/compiled/libarchive.d.ts
@@ -0,0 +1,31 @@
import { ArchiveCompression, ArchiveFormat } from "./formats.js";
import { ArchiveReader } from "./archive-reader.js";
export { ArchiveCompression, ArchiveFormat } from "./formats.js";
export type ArchiveOptions = {
workerUrl?: string | URL;
getWorker?: Function;
createClient?: (worker: any) => any;
};
export type ArchiveEntryFile = {
file: File;
pathname?: string;
};
export type ArchiveWriteOptions = {
files: ArchiveEntryFile[];
outputFileName: string;
compression: ArchiveCompression;
format: ArchiveFormat;
passphrase: string | null;
};
export declare class Archive {
private static _options;
/**
* Initialize libarchivejs
* @param {Object} options
*/
static init(options?: ArchiveOptions | null): ArchiveOptions;
static open(file: File): Promise<ArchiveReader>;
static write({ files, outputFileName, compression, format, passphrase, }: ArchiveWriteOptions): Promise<File>;
private static getWorker;
private static getClient;
}
3 changes: 3 additions & 0 deletions dist/build/compiled/utils.d.ts
@@ -0,0 +1,3 @@
export declare function cloneContent(obj: any): any;
export declare function objectToArray(obj: any, path?: string): any[];
export declare function getObjectPropReference(obj: any, path: string): any[];
12 changes: 12 additions & 0 deletions dist/libarchive-node.mjs
@@ -0,0 +1,12 @@
import{Worker as e}from"worker_threads";import{URL as t}from"url";
/**
* @license
* Copyright 2019 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/const n=Symbol("Comlink.proxy"),r=Symbol("Comlink.endpoint"),s=Symbol("Comlink.releaseProxy"),i=Symbol("Comlink.finalizer"),a=Symbol("Comlink.thrown"),o=e=>"object"==typeof e&&null!==e||"function"==typeof e,c=new Map([["proxy",{canHandle:e=>o(e)&&e[n],serialize(e){const{port1:t,port2:n}=new MessageChannel;return l(e,t),[n,[n]]},deserialize:e=>(e.start(),p(e))}],["throw",{canHandle:e=>o(e)&&a in e,serialize({value:e}){let t;return t=e instanceof Error?{isError:!0,value:{message:e.message,name:e.name,stack:e.stack}}:{isError:!1,value:e},[t,[]]},deserialize(e){if(e.isError)throw Object.assign(new Error(e.value.message),e.value);throw e.value}}]]);function l(e,t=globalThis,n=["*"]){t.addEventListener("message",(function r(s){if(!s||!s.data)return;if(!function(e,t){for(const n of e){if(t===n||"*"===n)return!0;if(n instanceof RegExp&&n.test(t))return!0}return!1}(n,s.origin))return void console.warn(`Invalid origin '${s.origin}' for comlink proxy`);const{id:o,type:c,path:p}=Object.assign({path:[]},s.data),h=(s.data.argumentList||[]).map(b);let f;try{const t=p.slice(0,-1).reduce(((e,t)=>e[t]),e),n=p.reduce(((e,t)=>e[t]),e);switch(c){case"GET":f=n;break;case"SET":t[p.slice(-1)[0]]=b(s.data.value),f=!0;break;case"APPLY":f=n.apply(t,h);break;case"CONSTRUCT":f=y(new n(...h));break;case"ENDPOINT":{const{port1:t,port2:n}=new MessageChannel;l(e,n),f=function(e,t){return v.set(e,t),e}(t,[t])}break;case"RELEASE":f=void 0;break;default:return}}catch(e){f={value:e,[a]:0}}Promise.resolve(f).catch((e=>({value:e,[a]:0}))).then((n=>{const[s,a]=E(n);t.postMessage(Object.assign(Object.assign({},s),{id:o}),a),"RELEASE"===c&&(t.removeEventListener("message",r),u(t),i in e&&"function"==typeof e[i]&&e[i]())})).catch((e=>{const[n,r]=E({value:new TypeError("Unserializable return value"),[a]:0});t.postMessage(Object.assign(Object.assign({},n),{id:o}),r)}))})),t.start&&t.start()}function u(e){(function(e){return"MessagePort"===e.constructor.name})(e)&&e.close()}function p(e,t){return g(e,[],t)}function h(e){if(e)throw new Error("Proxy has been released and is not useable")}function f(e){return R(e,{type:"RELEASE"}).then((()=>{u(e)}))}const d=new WeakMap,m="FinalizationRegistry"in globalThis&&new FinalizationRegistry((e=>{const t=(d.get(e)||0)-1;d.set(e,t),0===t&&f(e)}));function g(e,t=[],n=function(){}){let i=!1;const a=new Proxy(n,{get(n,r){if(h(i),r===s)return()=>{!function(e){m&&m.unregister(e)}(a),f(e),i=!0};if("then"===r){if(0===t.length)return{then:()=>a};const n=R(e,{type:"GET",path:t.map((e=>e.toString()))}).then(b);return n.then.bind(n)}return g(e,[...t,r])},set(n,r,s){h(i);const[a,o]=E(s);return R(e,{type:"SET",path:[...t,r].map((e=>e.toString())),value:a},o).then(b)},apply(n,s,a){h(i);const o=t[t.length-1];if(o===r)return R(e,{type:"ENDPOINT"}).then(b);if("bind"===o)return g(e,t.slice(0,-1));const[c,l]=w(a);return R(e,{type:"APPLY",path:t.map((e=>e.toString())),argumentList:c},l).then(b)},construct(n,r){h(i);const[s,a]=w(r);return R(e,{type:"CONSTRUCT",path:t.map((e=>e.toString())),argumentList:s},a).then(b)}});return function(e,t){const n=(d.get(t)||0)+1;d.set(t,n),m&&m.register(e,t,e)}(a,e),a}function w(e){const t=e.map(E);return[t.map((e=>e[0])),(n=t.map((e=>e[1])),Array.prototype.concat.apply([],n))];var n}const v=new WeakMap;function y(e){return Object.assign(e,{[n]:!0})}function E(e){for(const[t,n]of c)if(n.canHandle(e)){const[r,s]=n.serialize(e);return[{type:"HANDLER",name:t,value:r},s]}return[{type:"RAW",value:e},v.get(e)||[]]}function b(e){switch(e.type){case"HANDLER":return 
c.get(e.name).deserialize(e.value);case"RAW":return e.value}}function R(e,t,n){return new Promise((r=>{const s=new Array(4).fill(0).map((()=>Math.floor(Math.random()*Number.MAX_SAFE_INTEGER).toString(16))).join("-");e.addEventListener("message",(function t(n){n.data&&n.data.id&&n.data.id===s&&(e.removeEventListener("message",t),r(n.data))})),e.start&&e.start(),e.postMessage(Object.assign({id:s},t),n)}))}class k{constructor(e,t,n,r,s){this._name=e,this._size=t,this._path=n,this._lastModified=r,this._archiveRef=s}get name(){return this._name}get size(){return this._size}get lastModified(){return this._lastModified}extract(){return this._archiveRef.extractSingleFile(this._path)}}function P(e){if(e instanceof File||e instanceof k||null===e)return e;const t={};for(const n of Object.keys(e))t[n]=P(e[n]);return t}function S(e,t=""){const n=[];for(const r of Object.keys(e))e[r]instanceof File||e[r]instanceof k||null===e[r]?n.push({file:e[r]||r,path:t}):n.push(...S(e[r],`${t}${r}/`));return n}function _(e,t){const n=t.split("/");""===n[n.length-1]&&n.pop();let r=e,s=null;for(const e of n)r[e]=r[e]||{},s=r,r=r[e];return[s,n[n.length-1]]}class A{constructor(e,t,n){this._content={},this._processed=0,this.file=e,this.client=t,this.worker=n}open(){return this._content={},this._processed=0,new Promise(((e,t)=>{this.client.open(this.file,y((()=>{e(this)})))}))}async close(){var e;null===(e=this.worker)||void 0===e||e.terminate(),this.worker=null,this.client=null,this.file=null}async hasEncryptedData(){return await this.client.hasEncryptedData()}async usePassword(e){await this.client.usePassword(e)}async setLocale(e){await this.client.setLocale(e)}async getFilesObject(){if(this._processed>0)return Promise.resolve().then((()=>this._content));return(await this.client.listFiles()).forEach((e=>{const[t,n]=_(this._content,e.path);"FILE"===e.type&&(t[n]=new k(e.fileName,e.size,e.path,e.lastModified,this))})),this._processed=1,P(this._content)}getFilesArray(){return this.getFilesObject().then((e=>S(e)))}async extractSingleFile(e){if(null===this.worker)throw new Error("Archive already closed");const t=await this.client.extractSingleFile(e);return new File([t.fileData],t.fileName,{type:"application/octet-stream",lastModified:t.lastModified/1e6})}async extractFiles(e=void 0){var t;if(this._processed>1)return Promise.resolve().then((()=>this._content));return(await this.client.extractFiles()).forEach((t=>{const[n,r]=_(this._content,t.path);"FILE"===t.type&&(n[r]=new File([t.fileData],t.fileName,{type:"application/octet-stream"}),void 0!==e&&setTimeout(e.bind(null,{file:n[r],path:t.path})))})),this._processed=2,null===(t=this.worker)||void 0===t||t.terminate(),P(this._content)}}var L,O;!function(e){e.SEVEN_ZIP="7zip",e.AR="ar",e.ARBSD="arbsd",e.ARGNU="argnu",e.ARSVR4="arsvr4",e.BIN="bin",e.BSDTAR="bsdtar",e.CD9660="cd9660",e.CPIO="cpio",e.GNUTAR="gnutar",e.ISO="iso",e.ISO9660="iso9660",e.MTREE="mtree",e.MTREE_CLASSIC="mtree-classic",e.NEWC="newc",e.ODC="odc",e.OLDTAR="oldtar",e.PAX="pax",e.PAXR="paxr",e.POSIX="posix",e.PWB="pwb",e.RAW="raw",e.RPAX="rpax",e.SHAR="shar",e.SHARDUMP="shardump",e.USTAR="ustar",e.V7TAR="v7tar",e.V7="v7",e.WARC="warc",e.XAR="xar",e.ZIP="zip"}(L||(L={})),function(e){e.B64ENCODE="b64encode",e.BZIP2="bzip2",e.COMPRESS="compress",e.GRZIP="grzip",e.GZIP="gzip",e.LRZIP="lrzip",e.LZ4="lz4",e.LZIP="lzip",e.LZMA="lzma",e.LZOP="lzop",e.UUENCODE="uuencode",e.XZ="xz",e.ZSTD="zstd",e.NONE="none"}(O||(O={}));class z{static init(e=null){return z._options=e||{},z._options}static async open(e){const 
t=z.getWorker(z._options),n=await z.getClient(t,z._options),r=new A(e,n,t);return await r.open()}static async write({files:e,outputFileName:t,compression:n,format:r,passphrase:s=null}){const i=z.getWorker(z._options),a=await z.getClient(i,z._options),o=await a.writeArchive(e,n,r,s);return i.terminate(),new File([o],t,{type:"application/octet-stream"})}static getWorker(e){return e.getWorker?e.getWorker():new Worker(e.workerUrl||new URL("./worker-bundle.js",import.meta.url),{type:"module"})}static async getClient(e,t){var n;const r=(null===(n=t.createClient)||void 0===n?void 0:n.call(t,e))||p(e);let{promise:s,resolve:i}=Promise.withResolvers();const a=await new r(y((()=>{i()})));return await s,a}}z._options={},Promise.withResolvers||(Promise.withResolvers=function(){var e,t,n=new this((function(n,r){e=n,t=r}));return{resolve:e,reject:t,promise:n}});const C=new t(".",import.meta.url).pathname;z.init({getWorker:()=>new e(`${C}/worker-bundle-node.mjs`),createClient:e=>p(
/**
* @license
* Copyright 2019 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
function(e){const t=new WeakMap;return{postMessage:e.postMessage.bind(e),addEventListener:(n,r)=>{const s=e=>{"handleEvent"in r?r.handleEvent({data:e}):r({data:e})};e.on("message",s),t.set(r,s)},removeEventListener:(n,r)=>{const s=t.get(r);s&&(e.off("message",s),t.delete(r))},start:e.start&&e.start.bind(e)}}(e))});export{z as Archive,O as ArchiveCompression,L as ArchiveFormat};