mirror of https://github.com/S2-/gitlit synced 2025-08-03 12:50:04 +02:00

add node modules to repo

s2
2018-06-03 13:47:11 +02:00
parent e8c95255e8
commit d002126b72
4115 changed files with 440218 additions and 7519 deletions

node_modules/asar/CHANGELOG.md generated vendored Normal file

@@ -0,0 +1,65 @@
# Changes By Version

## 0.14.0 - 2017-11-02

### Added

- Snapcraft metadata (#130)
- `uncache` and `uncacheAll` (#118)

### Fixed

- Use of asar inside of an Electron app (#118)

## 0.13.1 - 2017-11-02

### Fixed

- Do not return before the write stream fully closes (#113)

## 0.13.0 - 2017-01-09

### Changed

- Dropped support for Node `0.10.0` and `0.12.0`. The minimum supported version
  is now Node `4.6.0`. (#100)
- This project was ported from CoffeeScript to JavaScript. The behavior and
  APIs should be the same as previous releases. (#100)

## 0.12.4 - 2016-12-28

### Fixed

- Unpack glob patterns containing `{}` characters not working properly (#99)

## 0.12.3 - 2016-08-29

### Fixed

- Multibyte characters in paths are now supported (#86)

## 0.12.2 - 2016-08-22

### Fixed

- Upgraded `minimatch` to `^3.0.3` from `^3.0.0` for [RegExp DOS fix](https://nodesecurity.io/advisories/minimatch_regular-expression-denial-of-service).

## 0.12.1 - 2016-07-25

### Fixed

- Fix `Maximum call stack size exceeded` error regression (#80)

## 0.12.0 - 2016-07-20

### Added

- Added `transform` option to specify a `stream.Transform` function to the
  `createPackageWithOptions` API (#73)

## 0.11.0 - 2016-04-06

### Fixed

- Upgraded `mksnapshot` dependency to remove logged `graceful-fs` deprecation
  warnings (#61)

node_modules/asar/LICENSE.md generated vendored Normal file

@@ -0,0 +1,20 @@
Copyright (c) 2014 GitHub Inc.

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

node_modules/asar/README.md generated vendored Normal file

@@ -0,0 +1,193 @@
# asar - Electron Archive
[![Travis build status](https://travis-ci.org/electron/asar.svg?branch=master)](https://travis-ci.org/electron/asar)
[![AppVeyor build status](https://ci.appveyor.com/api/projects/status/mrfwfr0uxlbwkuq3?svg=true)](https://ci.appveyor.com/project/electron-bot/asar)
[![dependencies](http://img.shields.io/david/electron/asar.svg?style=flat-square)](https://david-dm.org/electron/asar)
[![npm version](http://img.shields.io/npm/v/asar.svg?style=flat-square)](https://npmjs.org/package/asar)
Asar is a simple extensible archive format. It works like `tar` in that it
concatenates all files together without compression, while still having
random access support.
## Features
* Support random access
* Use JSON to store files' information
* Very easy to write a parser
## Command line utility
### Install
```bash
$ npm install asar
```
### Usage
```bash
$ asar --help
  Usage: asar [options] [command]

  Commands:

    pack|p <dir> <output>
       create asar archive

    list|l <archive>
       list files of asar archive

    extract-file|ef <archive> <filename>
       extract one file from archive

    extract|e <archive> <dest>
       extract archive

  Options:

    -h, --help     output usage information
    -V, --version  output the version number
```
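A typical round trip with the commands above might look like this (the `app` directory and the archive/destination names are placeholders):

```bash
$ asar pack app app.asar      # pack the contents of ./app into app.asar
$ asar list app.asar          # show the files recorded in the archive header
$ asar extract app.asar dest  # extract everything into ./dest
```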
#### Excluding multiple resources from being packed
Given:
```
    app
(a) ├── x1
(b) ├── x2
(c) ├── y3
(d) │   ├── x1
(e) │   └── z1
(f) │       └── x2
(g) └── z4
(h)     └── w1
```
Exclude: a, b
```bash
$ asar pack app app.asar --unpack-dir "{x1,x2}"
```
Exclude: a, b, d, f
```bash
$ asar pack app app.asar --unpack-dir "**/{x1,x2}"
```
Exclude: a, b, d, f, h
```bash
$ asar pack app app.asar --unpack-dir "{**/x1,**/x2,z4/w1}"
```
## Using programmatically
### Example
```js
var asar = require('asar');
var src = 'some/path/';
var dest = 'name.asar';
asar.createPackage(src, dest, function() {
  console.log('done.');
})
```
Please note that there is currently **no** error handling provided!
### Transform
You can pass in a `transform` option, which is a function that returns either
nothing or a `stream.Transform`. The latter will be applied to files that will be
in the `.asar` file to transform them (e.g. compress).
```js
var asar = require('asar');
var src = 'some/path/';
var dest = 'name.asar';
function transform(filename) {
  return new CustomTransformStream()
}
asar.createPackageWithOptions(src, dest, { transform: transform }, function() {
  console.log('done.');
})
```
## Using with grunt
There is also an unofficial grunt plugin to generate asar archives at [bwin/grunt-asar][grunt-asar].
## Format
Asar uses [Pickle][pickle] to safely serialize binary values to a file; there is
also a [Node.js binding][node-pickle] of the `Pickle` class.
The format of asar is very flat:
```
| UInt32: header_size | String: header | Bytes: file1 | ... | Bytes: file42 |
```
The `header_size` and `header` are serialized with the [Pickle][pickle] class, and
`header_size`'s [Pickle][pickle] object is 8 bytes.
The `header` is a JSON string, and the `header_size` is the size of `header`'s
`Pickle` object.
The structure of `header` is something like this:
```json
{
  "files": {
    "tmp": {
      "files": {}
    },
    "usr": {
      "files": {
        "bin": {
          "files": {
            "ls": {
              "offset": "0",
              "size": 100,
              "executable": true
            },
            "cd": {
              "offset": "100",
              "size": 100,
              "executable": true
            }
          }
        }
      }
    },
    "etc": {
      "files": {
        "hosts": {
          "offset": "200",
          "size": 32
        }
      }
    }
  }
}
```
`offset` and `size` record the information needed to read the file from the
archive. The `offset` starts from 0, so you have to manually add the size of
`header_size` and `header` to the `offset` to get the real offset of the file.

`offset` is a UINT64 number represented as a string, because there is no way to
precisely represent UINT64 in a JavaScript `Number`. `size` is a JavaScript
`Number` that is no larger than `Number.MAX_SAFE_INTEGER`, which has a value of
`9007199254740991` and is about 8PB in size. We didn't store `size` as UINT64
because file sizes in Node.js are represented as `Number` and it is not safe to
convert a `Number` to UINT64.
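As a rough illustration of the layout, here is a minimal sketch of a reader built
on the `chromium-pickle-js` package; the archive path `app.asar` and the
`etc/hosts` entry (taken from the sample header above) are assumptions for the
example:

```js
const fs = require('fs');
const pickle = require('chromium-pickle-js');

// read the 8-byte Pickle object that stores header_size
const fd = fs.openSync('app.asar', 'r'); // hypothetical archive path
const sizeBuf = Buffer.alloc(8);
fs.readSync(fd, sizeBuf, 0, 8, 0);
const headerSize = pickle.createFromBuffer(sizeBuf).createIterator().readUInt32();

// read the header Pickle and parse the JSON string inside it
const headerBuf = Buffer.alloc(headerSize);
fs.readSync(fd, headerBuf, 0, headerSize, 8);
const header = JSON.parse(pickle.createFromBuffer(headerBuf).createIterator().readString());

// file contents start right after the two Pickle objects, so the
// real offset is 8 + headerSize plus the offset stored in the header
const entry = header.files['etc'].files['hosts']; // entry from the sample header
const content = Buffer.alloc(entry.size);
fs.readSync(fd, content, 0, entry.size, 8 + headerSize + parseInt(entry.offset, 10));
fs.closeSync(fd);
```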
[pickle]: https://chromium.googlesource.com/chromium/src/+/master/base/pickle.h
[node-pickle]: https://www.npmjs.org/package/chromium-pickle
[grunt-asar]: https://github.com/bwin/grunt-asar

node_modules/asar/bin/asar.js generated vendored Executable file

@@ -0,0 +1,72 @@
#!/usr/bin/env node

var asar = require('../lib/asar')
var program = require('commander')

program.version('v' + require('../package.json').version)
  .description('Manipulate asar archive files')

program.command('pack <dir> <output>')
  .alias('p')
  .description('create asar archive')
  .option('--ordering <file path>', 'path to a text file for ordering contents')
  .option('--unpack <expression>', 'do not pack files matching glob <expression>')
  .option('--unpack-dir <expression>', 'do not pack dirs matching glob <expression> or starting with literal <expression>')
  .option('--snapshot', 'create snapshot')
  .option('--exclude-hidden', 'exclude hidden files')
  .option('--sv <version>', '(snapshot) version of Electron')
  .option('--sa <arch>', '(snapshot) arch of Electron')
  .option('--sb <builddir>', '(snapshot) where to put downloaded files')
  .action(function (dir, output, options) {
    options = {
      unpack: options.unpack,
      unpackDir: options.unpackDir,
      snapshot: options.snapshot,
      ordering: options.ordering,
      version: options.sv,
      arch: options.sa,
      builddir: options.sb,
      dot: !options.excludeHidden
    }
    asar.createPackageWithOptions(dir, output, options, function (error) {
      if (error) {
        console.error(error.stack)
        process.exit(1)
      }
    })
  })

program.command('list <archive>')
  .alias('l')
  .description('list files of asar archive')
  .action(function (archive) {
    var files = asar.listPackage(archive)
    for (var i in files) {
      console.log(files[i])
    }
  })

program.command('extract-file <archive> <filename>')
  .alias('ef')
  .description('extract one file from archive')
  .action(function (archive, filename) {
    require('fs').writeFileSync(require('path').basename(filename),
      asar.extractFile(archive, filename))
  })

program.command('extract <archive> <dest>')
  .alias('e')
  .description('extract archive')
  .action(function (archive, dest) {
    asar.extractAll(archive, dest)
  })

program.command('*')
  .action(function (cmd) {
    console.log('asar: \'%s\' is not an asar command. See \'asar --help\'.', cmd)
  })

program.parse(process.argv)

if (program.args.length === 0) {
  program.help()
}

node_modules/asar/lib/asar.js generated vendored Normal file

@@ -0,0 +1,232 @@
'use strict'

const fs = process.versions.electron ? require('original-fs') : require('fs')
const path = require('path')
const minimatch = require('minimatch')
const mkdirp = require('mkdirp')

const Filesystem = require('./filesystem')
const disk = require('./disk')
const crawlFilesystem = require('./crawlfs')
const createSnapshot = require('./snapshot')

// Return whether or not a directory should be excluded from packing due to
// "--unpack-dir" option
//
// @param {string} path - directory path to check
// @param {string} pattern - literal prefix [for backward compatibility] or glob pattern
// @param {array} unpackDirs - Array of directory paths previously marked as unpacked
//
const isUnpackDir = function (path, pattern, unpackDirs) {
  if (path.indexOf(pattern) === 0 || minimatch(path, pattern)) {
    if (unpackDirs.indexOf(path) === -1) {
      unpackDirs.push(path)
    }
    return true
  } else {
    for (let i = 0; i < unpackDirs.length; i++) {
      if (path.indexOf(unpackDirs[i]) === 0) {
        return true
      }
    }
    return false
  }
}

module.exports.createPackage = function (src, dest, callback) {
  return module.exports.createPackageWithOptions(src, dest, {}, callback)
}

module.exports.createPackageWithOptions = function (src, dest, options, callback) {
  const globOptions = options.globOptions ? options.globOptions : {}
  globOptions.dot = options.dot === undefined ? true : options.dot

  let pattern = src + '/**/*'
  if (options.pattern) {
    pattern = src + options.pattern
  }

  return crawlFilesystem(pattern, globOptions, function (error, filenames, metadata) {
    if (error) { return callback(error) }
    module.exports.createPackageFromFiles(src, dest, filenames, metadata, options, callback)
  })
}

/*
createPackageFromFiles - Create an asar-archive from a list of filenames

src: Base path. All files are relative to this.
dest: Archive filename (& path).
filenames: Array of filenames relative to src.
metadata: Object with filenames as keys and {type='directory|file|link', stat: fs.stat} as values. (Optional)
options: The options.
callback: The callback function. Accepts (err).
*/
module.exports.createPackageFromFiles = function (src, dest, filenames, metadata, options, callback) {
  if (typeof metadata === 'undefined' || metadata === null) { metadata = {} }
  const filesystem = new Filesystem(src)
  const files = []
  const unpackDirs = []

  let filenamesSorted = []
  if (options.ordering) {
    const orderingFiles = fs.readFileSync(options.ordering).toString().split('\n').map(function (line) {
      if (line.includes(':')) { line = line.split(':').pop() }
      line = line.trim()
      if (line.startsWith('/')) { line = line.slice(1) }
      return line
    })

    const ordering = []
    for (const file of orderingFiles) {
      const pathComponents = file.split(path.sep)
      let str = src
      for (const pathComponent of pathComponents) {
        str = path.join(str, pathComponent)
        ordering.push(str)
      }
    }

    let missing = 0
    const total = filenames.length

    for (const file of ordering) {
      if (!filenamesSorted.includes(file) && filenames.includes(file)) {
        filenamesSorted.push(file)
      }
    }

    for (const file of filenames) {
      if (!filenamesSorted.includes(file)) {
        filenamesSorted.push(file)
        missing += 1
      }
    }

    console.log(`Ordering file has ${((total - missing) / total) * 100}% coverage.`)
  } else {
    filenamesSorted = filenames
  }

  const handleFile = function (filename, done) {
    let file = metadata[filename]
    let type
    if (!file) {
      const stat = fs.lstatSync(filename)
      if (stat.isDirectory()) { type = 'directory' }
      if (stat.isFile()) { type = 'file' }
      if (stat.isSymbolicLink()) { type = 'link' }
      file = {stat, type}
    }

    let shouldUnpack
    switch (file.type) {
      case 'directory':
        shouldUnpack = options.unpackDir
          ? isUnpackDir(path.relative(src, filename), options.unpackDir, unpackDirs)
          : false
        filesystem.insertDirectory(filename, shouldUnpack)
        break
      case 'file':
        shouldUnpack = false
        if (options.unpack) {
          shouldUnpack = minimatch(filename, options.unpack, {matchBase: true})
        }
        if (!shouldUnpack && options.unpackDir) {
          const dirName = path.relative(src, path.dirname(filename))
          shouldUnpack = isUnpackDir(dirName, options.unpackDir, unpackDirs)
        }
        files.push({filename: filename, unpack: shouldUnpack})
        filesystem.insertFile(filename, shouldUnpack, file, options, done)
        return
      case 'link':
        filesystem.insertLink(filename, file.stat)
        break
    }
    return process.nextTick(done)
  }

  const insertsDone = function () {
    return mkdirp(path.dirname(dest), function (error) {
      if (error) { return callback(error) }
      return disk.writeFilesystem(dest, filesystem, files, metadata, function (error) {
        if (error) { return callback(error) }
        if (options.snapshot) {
          return createSnapshot(src, dest, filenames, metadata, options, callback)
        } else {
          return callback(null)
        }
      })
    })
  }

  const names = filenamesSorted.slice()

  const next = function (name) {
    if (!name) { return insertsDone() }

    return handleFile(name, function () {
      return next(names.shift())
    })
  }

  return next(names.shift())
}

module.exports.statFile = function (archive, filename, followLinks) {
  const filesystem = disk.readFilesystemSync(archive)
  return filesystem.getFile(filename, followLinks)
}

module.exports.listPackage = function (archive) {
  return disk.readFilesystemSync(archive).listFiles()
}

module.exports.extractFile = function (archive, filename) {
  const filesystem = disk.readFilesystemSync(archive)
  return disk.readFileSync(filesystem, filename, filesystem.getFile(filename))
}

module.exports.extractAll = function (archive, dest) {
  const filesystem = disk.readFilesystemSync(archive)
  const filenames = filesystem.listFiles()

  // under windows just extract links as regular files
  const followLinks = process.platform === 'win32'

  // create destination directory
  mkdirp.sync(dest)

  return filenames.map((filename) => {
    filename = filename.substr(1) // get rid of leading slash
    const destFilename = path.join(dest, filename)
    const file = filesystem.getFile(filename, followLinks)
    if (file.files) {
      // it's a directory, create it and continue with the next entry
      mkdirp.sync(destFilename)
    } else if (file.link) {
      // it's a symlink, create a symlink
      const linkSrcPath = path.dirname(path.join(dest, file.link))
      const linkDestPath = path.dirname(destFilename)
      const relativePath = path.relative(linkDestPath, linkSrcPath);
      // try to delete output file, because we can't overwrite a link
      (() => {
        try {
          fs.unlinkSync(destFilename)
        } catch (error) {}
      })()
      const linkTo = path.join(relativePath, path.basename(file.link))
      fs.symlinkSync(linkTo, destFilename)
    } else {
      // it's a file, extract it
      const content = disk.readFileSync(filesystem, filename, file)
      fs.writeFileSync(destFilename, content)
    }
  })
}

module.exports.uncache = function (archive) {
  return disk.uncacheFilesystem(archive)
}

module.exports.uncacheAll = function () {
  disk.uncacheAll()
}

node_modules/asar/lib/crawlfs.js generated vendored Normal file

@@ -0,0 +1,21 @@
'use strict'

const fs = process.versions.electron ? require('original-fs') : require('fs')
const glob = require('glob')

module.exports = function (dir, options, callback) {
  const metadata = {}
  return glob(dir, options, function (error, filenames) {
    if (error) { return callback(error) }
    for (const filename of filenames) {
      const stat = fs.lstatSync(filename)
      if (stat.isFile()) {
        metadata[filename] = {type: 'file', stat: stat}
      } else if (stat.isDirectory()) {
        metadata[filename] = {type: 'directory', stat: stat}
      } else if (stat.isSymbolicLink()) {
        metadata[filename] = {type: 'link', stat: stat}
      }
    }
    return callback(null, filenames, metadata)
  })
}

node_modules/asar/lib/disk.js generated vendored Normal file

@@ -0,0 +1,134 @@
'use strict'

const fs = process.versions.electron ? require('original-fs') : require('fs')
const path = require('path')
const mkdirp = require('mkdirp')
const pickle = require('chromium-pickle-js')

const Filesystem = require('./filesystem')
let filesystemCache = {}

const copyFileToSync = function (dest, src, filename) {
  const srcFile = path.join(src, filename)
  const targetFile = path.join(dest, filename)

  const content = fs.readFileSync(srcFile)
  const stats = fs.statSync(srcFile)
  mkdirp.sync(path.dirname(targetFile))
  return fs.writeFileSync(targetFile, content, {mode: stats.mode})
}

const writeFileListToStream = function (dest, filesystem, out, list, metadata, callback) {
  for (let i = 0; i < list.length; i++) {
    const file = list[i]
    if (file.unpack) {
      // the file should not be packed into archive.
      const filename = path.relative(filesystem.src, file.filename)
      try {
        copyFileToSync(`${dest}.unpacked`, filesystem.src, filename)
      } catch (error) {
        return callback(error)
      }
    } else {
      const tr = metadata[file.filename].transformed
      const stream = fs.createReadStream((tr ? tr.path : file.filename))
      stream.pipe(out, {end: false})
      stream.on('error', callback)
      return stream.on('end', function () {
        return writeFileListToStream(dest, filesystem, out, list.slice(i + 1), metadata, callback)
      })
    }
  }
  out.end()
  return callback(null)
}

module.exports.writeFilesystem = function (dest, filesystem, files, metadata, callback) {
  let sizeBuf
  let headerBuf
  try {
    const headerPickle = pickle.createEmpty()
    headerPickle.writeString(JSON.stringify(filesystem.header))
    headerBuf = headerPickle.toBuffer()

    const sizePickle = pickle.createEmpty()
    sizePickle.writeUInt32(headerBuf.length)
    sizeBuf = sizePickle.toBuffer()
  } catch (error) {
    return callback(error)
  }

  const out = fs.createWriteStream(dest)
  out.on('error', callback)
  out.write(sizeBuf)
  return out.write(headerBuf, function () {
    return writeFileListToStream(dest, filesystem, out, files, metadata, callback)
  })
}

module.exports.readArchiveHeaderSync = function (archive) {
  const fd = fs.openSync(archive, 'r')
  let size
  let headerBuf
  try {
    const sizeBuf = new Buffer(8)
    if (fs.readSync(fd, sizeBuf, 0, 8, null) !== 8) {
      throw new Error('Unable to read header size')
    }

    const sizePickle = pickle.createFromBuffer(sizeBuf)
    size = sizePickle.createIterator().readUInt32()
    headerBuf = new Buffer(size)
    if (fs.readSync(fd, headerBuf, 0, size, null) !== size) {
      throw new Error('Unable to read header')
    }
  } finally {
    fs.closeSync(fd)
  }

  const headerPickle = pickle.createFromBuffer(headerBuf)
  const header = headerPickle.createIterator().readString()
  return {header: JSON.parse(header), headerSize: size}
}

module.exports.readFilesystemSync = function (archive) {
  if (!filesystemCache[archive]) {
    const header = this.readArchiveHeaderSync(archive)
    const filesystem = new Filesystem(archive)
    filesystem.header = header.header
    filesystem.headerSize = header.headerSize
    filesystemCache[archive] = filesystem
  }
  return filesystemCache[archive]
}

module.exports.uncacheFilesystem = function (archive) {
  if (filesystemCache[archive]) {
    filesystemCache[archive] = undefined
    return true
  }
  return false
}

module.exports.uncacheAll = function () {
  filesystemCache = {}
}

module.exports.readFileSync = function (filesystem, filename, info) {
  let buffer = new Buffer(info.size)
  if (info.size <= 0) { return buffer }
  if (info.unpacked) {
    // it's an unpacked file, copy it.
    buffer = fs.readFileSync(path.join(`${filesystem.src}.unpacked`, filename))
  } else {
    // Node throws an exception when reading 0 bytes into a 0-size buffer,
    // so we short-circuit the read in this case.
    const fd = fs.openSync(filesystem.src, 'r')
    try {
      const offset = 8 + filesystem.headerSize + parseInt(info.offset)
      fs.readSync(fd, buffer, 0, info.size, offset)
    } finally {
      fs.closeSync(fd)
    }
  }
  return buffer
}

node_modules/asar/lib/filesystem.js generated vendored Normal file

@@ -0,0 +1,151 @@
'use strict'

const fs = process.versions.electron ? require('original-fs') : require('fs')
const path = require('path')
const tmp = require('tmp')
const UINT64 = require('cuint').UINT64

class Filesystem {
  constructor (src) {
    this.src = path.resolve(src)
    this.header = {files: {}}
    this.offset = UINT64(0)
  }

  searchNodeFromDirectory (p) {
    let json = this.header
    const dirs = p.split(path.sep)
    for (const dir of dirs) {
      if (dir !== '.') {
        json = json.files[dir]
      }
    }
    return json
  }

  searchNodeFromPath (p) {
    p = path.relative(this.src, p)
    if (!p) { return this.header }
    const name = path.basename(p)
    const node = this.searchNodeFromDirectory(path.dirname(p))
    if (node.files == null) {
      node.files = {}
    }
    if (node.files[name] == null) {
      node.files[name] = {}
    }
    return node.files[name]
  }

  insertDirectory (p, shouldUnpack) {
    const node = this.searchNodeFromPath(p)
    if (shouldUnpack) {
      node.unpacked = shouldUnpack
    }
    node.files = {}
    return node.files
  }

  insertFile (p, shouldUnpack, file, options, callback) {
    const dirNode = this.searchNodeFromPath(path.dirname(p))
    const node = this.searchNodeFromPath(p)
    if (shouldUnpack || dirNode.unpacked) {
      node.size = file.stat.size
      node.unpacked = true
      process.nextTick(callback)
      return
    }

    const handler = () => {
      const size = file.transformed ? file.transformed.stat.size : file.stat.size

      // JavaScript cannot precisely represent integers >= UINT32_MAX.
      if (size > 4294967295) {
        throw new Error(`${p}: file size can not be larger than 4.2GB`)
      }

      node.size = size
      node.offset = this.offset.toString()
      if (process.platform !== 'win32' && (file.stat.mode & 0o100)) {
        node.executable = true
      }
      this.offset.add(UINT64(size))

      return callback()
    }

    const tr = options.transform && options.transform(p)
    if (tr) {
      return tmp.file(function (err, path) {
        if (err) { return handler() }
        const out = fs.createWriteStream(path)
        const stream = fs.createReadStream(p)

        stream.pipe(tr).pipe(out)
        return out.on('close', function () {
          file.transformed = {
            path,
            stat: fs.lstatSync(path)
          }
          return handler()
        })
      })
    } else {
      return process.nextTick(handler)
    }
  }

  insertLink (p, stat) {
    const link = path.relative(fs.realpathSync(this.src), fs.realpathSync(p))
    if (link.substr(0, 2) === '..') {
      throw new Error(`${p}: file links out of the package`)
    }
    const node = this.searchNodeFromPath(p)
    node.link = link
    return link
  }

  listFiles () {
    const files = []
    const fillFilesFromHeader = function (p, json) {
      if (!json.files) {
        return
      }
      return (() => {
        const result = []
        for (const f in json.files) {
          const fullPath = path.join(p, f)
          files.push(fullPath)
          result.push(fillFilesFromHeader(fullPath, json.files[f]))
        }
        return result
      })()
    }

    fillFilesFromHeader('/', this.header)
    return files
  }

  getNode (p) {
    const node = this.searchNodeFromDirectory(path.dirname(p))
    const name = path.basename(p)
    if (name) {
      return node.files[name]
    } else {
      return node
    }
  }

  getFile (p, followLinks) {
    followLinks = typeof followLinks === 'undefined' ? true : followLinks
    const info = this.getNode(p)

    // if followLinks is false we don't resolve symlinks
    if (info.link && followLinks) {
      return this.getFile(info.link)
    } else {
      return info
    }
  }
}

module.exports = Filesystem

node_modules/asar/lib/snapshot.js generated vendored Normal file

@@ -0,0 +1,62 @@
'use strict'

const fs = process.versions.electron ? require('original-fs') : require('fs')
const path = require('path')
const mksnapshot = require('mksnapshot')
const vm = require('vm')

const stripBOM = function (content) {
  if (content.charCodeAt(0) === 0xFEFF) {
    content = content.slice(1)
  }
  return content
}

const wrapModuleCode = function (script) {
  script = script.replace(/^#!.*/, '')
  return `(function(exports, require, module, __filename, __dirname) { ${script} \n});`
}

const dumpObjectToJS = function (content) {
  let result = 'var __ATOM_SHELL_SNAPSHOT = {\n'
  for (const filename in content) {
    const func = content[filename].toString()
    result += `  '${filename}': ${func},\n`
  }
  result += '};\n'
  return result
}

const createSnapshot = function (src, dest, filenames, metadata, options, callback) {
  const content = {}
  try {
    src = path.resolve(src)
    for (const filename of filenames) {
      const file = metadata[filename]
      if ((file.type === 'file' || file.type === 'link') && filename.substr(-3) === '.js') {
        const script = wrapModuleCode(stripBOM(fs.readFileSync(filename, 'utf8')))
        const relativeFilename = path.relative(src, filename)
        try {
          const compiled = vm.runInThisContext(script, {filename: relativeFilename})
          content[relativeFilename] = compiled
        } catch (error) {
          console.error('Ignoring ' + relativeFilename + ' for ' + error.name)
        }
      }
    }
  } catch (error) {
    return callback(error)
  }

  // run mksnapshot
  const str = dumpObjectToJS(content)
  const version = options.version
  const arch = options.arch
  const builddir = options.builddir
  let snapshotdir = options.snapshotdir

  if (typeof snapshotdir === 'undefined' || snapshotdir === null) { snapshotdir = path.dirname(dest) }
  const target = path.resolve(snapshotdir, 'snapshot_blob.bin')
  return mksnapshot(str, target, version, arch, builddir, callback)
}

module.exports = createSnapshot

node_modules/asar/package.json generated vendored Normal file

@@ -0,0 +1,78 @@
{
  "_args": [
    [
      "asar@0.14.3",
      "/home/s2/Documents/Code/gitlit"
    ]
  ],
  "_development": true,
  "_from": "asar@0.14.3",
  "_id": "asar@0.14.3",
  "_inBundle": false,
  "_integrity": "sha512-+hNnVVDmYbv05We/a9knj/98w171+A94A9DNHj+3kXUr3ENTQoSEcfbJRvBBRHyOh4vukBYWujmHvvaMmQoQbg==",
  "_location": "/asar",
  "_phantomChildren": {},
  "_requested": {
    "type": "version",
    "registry": true,
    "raw": "asar@0.14.3",
    "name": "asar",
    "escapedName": "asar",
    "rawSpec": "0.14.3",
    "saveSpec": null,
    "fetchSpec": "0.14.3"
  },
  "_requiredBy": [
    "/electron-packager"
  ],
  "_resolved": "https://registry.npmjs.org/asar/-/asar-0.14.3.tgz",
  "_spec": "0.14.3",
  "_where": "/home/s2/Documents/Code/gitlit",
  "bin": {
    "asar": "./bin/asar.js"
  },
  "bugs": {
    "url": "https://github.com/electron/asar/issues"
  },
  "dependencies": {
    "chromium-pickle-js": "^0.2.0",
    "commander": "^2.9.0",
    "cuint": "^0.2.1",
    "glob": "^6.0.4",
    "minimatch": "^3.0.3",
    "mkdirp": "^0.5.0",
    "mksnapshot": "^0.3.0",
    "tmp": "0.0.28"
  },
  "description": "Creating Electron app packages",
  "devDependencies": {
    "electron": "^1.6.2",
    "electron-mocha": "^3.4.0",
    "lodash": "^4.2.1",
    "mocha": "^2.0.1",
    "rimraf": "^2.5.1",
    "standard": "^8.6.0",
    "xvfb-maybe": "^0.1.3"
  },
  "engines": {
    "node": ">=4.6"
  },
  "homepage": "https://github.com/electron/asar",
  "license": "MIT",
  "main": "./lib/asar.js",
  "name": "asar",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/electron/asar.git"
  },
  "scripts": {
    "lint": "standard",
    "test": "xvfb-maybe electron-mocha --reporter spec && mocha --reporter spec && npm run lint"
  },
  "standard": {
    "env": {
      "mocha": true
    }
  },
  "version": "0.14.3"
}

node_modules/asar/snapcraft.yaml generated vendored Normal file

@@ -0,0 +1,18 @@
name: asar
version: git
summary: Manipulate asar archive files
description: |
  Asar is a simple extensible archive format, it works like tar that
  concatenates all files together without compression, while having
  random access support.

confinement: classic

parts:
  asar:
    plugin: nodejs
    source: .

apps:
  asar:
    command: lib/node_modules/asar/bin/asar.js