The decentralization of dependencies is one of my favorite features of Deno. Deno also simplifies the process of publishing and managing dependencies. Any file online can be included in another project independently, and only its tree of dependencies will be pulled in. With npm modules, on the other hand, if you require a single file that uses a single npm module, you unfortunately have to include ALL of that project's npm dependencies.
I would love a way to include url-imports in node, and I have a few thoughts about what that looks like.
- Do not use the existing `require` or `import` keywords; use a third-party module, or a separate command to run Node.
- Never fetch asynchronously at runtime; have a `url-import install` command that parses the file and downloads locks / files.
- Accommodate npm packages: given a URL, we have to scan for and resolve `package.json`, `package-lock.json`, `yarn.json`, `yarn.lock`, and `npm-shrinkwrap.json` at every directory level.
- Accommodate `tsconfig.json`, searching for the file within the URL structure at every directory level and applying individual configs to specific files.
- Lock the hashes of all files / URLs downloaded, throwing exceptions for mismatched hashes.
- Create a VS Code plugin to add type support.
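The directory-level scanning described above could be sketched like this. This is a minimal sketch, not a real tool: `candidateManifestUrls` and the `MANIFESTS` list are hypothetical names I'm introducing for illustration.

```typescript
// Hypothetical sketch: given a file URL, enumerate candidate manifest URLs
// at every directory level, mirroring how Node walks up the filesystem
// looking for package.json.
const MANIFESTS = [
  'package.json',
  'package-lock.json',
  'yarn.lock',
  'npm-shrinkwrap.json',
  'tsconfig.json',
];

function candidateManifestUrls(fileUrl: string): string[] {
  const url = new URL(fileUrl);
  const segments = url.pathname.split('/').slice(0, -1); // drop the filename
  const candidates: string[] = [];
  // Walk from the deepest directory up to the origin root.
  for (let i = segments.length; i >= 1; i--) {
    const dir = segments.slice(0, i).join('/');
    for (const manifest of MANIFESTS) {
      candidates.push(`${url.origin}${dir}/${manifest}`);
    }
  }
  return candidates;
}
```

For `https://reggi.com/foo/bar.ts` this yields `https://reggi.com/foo/package.json`, then the other manifests at `/foo/`, then the same set at the root. Each candidate is one HTTP request, which is exactly why this gets expensive.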
## The Vision
This is what it looks like if `url-import` is a third-party module. Running `url-import install` would download the file and do a couple of other checks:

```ts
import { urlImport } from 'url-import';

const file = urlImport('https://reggi.com/foo/bar.ts');
```
- Download `https://reggi.com/foo/bar.ts` to a common folder, `~/url-import`.
- Parse `bar.ts` for `require`, `import`, and `urlImport` statements.
- If there are local dependencies, download those files.
- If there are package imports, start checking for `package.json`.
- Check `https://reggi.com/foo/tsconfig.json`.
- If the above is not found, check `https://reggi.com/tsconfig.json`.
- Save a `url-import.lock` in the current working directory and include a "snapshot" that looks something like `{ fileUrl, fileHash, tsconfigUrl, tsConfigHash, packageUrl, packageHash }`. Essentially, save all the URLs used / found and hash the contents of every file; this lets us confirm that the state can be replayed, and track changes.
- Check `https://reggi.com/foo/url-import.lock`.
- If the above is not found, check `https://reggi.com/url-import.lock`.
- Pluck the npm modules from the files crawled and match them up with their relevant resolved `package.json`.
- Confirm all hashes / locks.
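The lock / hash steps above might be sketched as follows. This is a minimal sketch under stated assumptions: the `LockSnapshot` interface mirrors the snapshot fields listed above, and `hashContents` / `verify` are hypothetical helpers, not part of any real package.

```typescript
import { createHash } from 'node:crypto';

// Hypothetical shape of one url-import.lock snapshot entry, matching the
// fields described above.
interface LockSnapshot {
  fileUrl: string;
  fileHash: string;
  tsconfigUrl?: string;
  tsConfigHash?: string;
  packageUrl?: string;
  packageHash?: string;
}

// Hash a file's contents so a later install can detect a remote change.
function hashContents(contents: string): string {
  return createHash('sha256').update(contents).digest('hex');
}

// Throw when a downloaded file no longer matches the locked hash.
function verify(lockedHash: string, contents: string): void {
  const actual = hashContents(contents);
  if (actual !== lockedHash) {
    throw new Error(`hash mismatch: expected ${lockedHash}, got ${actual}`);
  }
}
```

A subsequent `url-import install` would re-download each URL in the lock, run `verify`, and fail loudly if the remote content drifted; that is the "replayable state" property.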
## Conclusion
I'd love for Node.js to have a more robust system for managing dependencies, and I wish the Node.js team were interested in building one around URLs. But this is really hard to do because of npm: mixing URL imports and npm imports means making a lot of requests and crawling URLs.
What do you think? Does Node.js need to step away from NPM? Should we all just switch to Deno?