Comments (20)
The overall workflow would go like this:
- Write some JavaScript files in `public/` that make use of ES imports to require NPM packages.
- Run @pika/web, which would then scan `public/` (configurable) for all such files, find all the NPM imports, and emit individual packages to `public/web_modules` (configurable).
- Deploy all of `public/`, including its (hopefully very slim) `web_modules/` subdirectory, which only contains the bare minimum!
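The scan step in the middle bullet can be sketched roughly like this (the function name is hypothetical; a real implementation would parse each file properly rather than regex-matching):

```javascript
// Hypothetical sketch of the scan step: collect the bare (NPM) specifiers
// in a source file, skipping relative and absolute imports.
function bareImports(source) {
  const specs = [...source.matchAll(/from\s*['"]([^'"]+)['"]/g)].map(m => m[1]);
  return specs.filter(s => !s.startsWith('.') && !s.startsWith('/'));
}
```

For example, `bareImports("import React from 'react';\nimport x from './util.js';")` yields `['react']`.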
from snowpack.
Thanks for writing this up, this was first proposed in #49 and I really like the idea. There are actually two "scan your imports" related tasks here that I'd like to tackle together, if possible:
- Scan your imports to detect which entrypoints need to be bundled
- Scan your imports to detect which exports you use off of each import, and then install a tree-shaken web_modules directory (ex: you `import {add} from 'lodash'`, and we tree-shake everything unused out of `lodash.js`).
The one concern I have with both of these is: are we breaking the promise "you only need to re-run @pika/web when your dependencies change"? Would that mean we should implement both as an opt-in flag to start? (Actually, I think you could tie #2 to our `--production` flag, since dev-time doesn't really need tree-shaking.)
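A minimal sketch of what the second task's analysis could look like (hypothetical function; a real implementation would walk the AST rather than regex-matching):

```javascript
// Hypothetical sketch for task #2: map each package to the named exports
// actually used, e.g. { lodash: ['add'] }, so everything else could be
// tree-shaken out of the installed web_modules file.
function usedExports(source) {
  const used = {};
  const re = /import\s*\{([^}]*)\}\s*from\s*['"]([^'"]+)['"]/g;
  for (const m of source.matchAll(re)) {
    const names = m[1]
      .split(',')
      .map(s => s.trim().split(/\s+as\s+/)[0]) // "add as sum" still uses "add"
      .filter(Boolean);
    used[m[2]] = (used[m[2]] || []).concat(names);
  }
  return used;
}
```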
I'm going to be on vacation for the next two weeks, and won't have regular access to internet. Are you interested in taking a stab at this? If so, feel free to hold off until I get back, or do a first pass and I can commit to reviewing it as soon as I'm back online.
from snowpack.
@FredKSchott Hope you have a relaxing vacation. If before you leave you have any time to fit in some brainstorming, I'm not going anywhere, so feel free to join me.
To make sure I understand the two related import-scanning tasks:
- This one is to e.g. find "react" when it sees `import React from 'react';` right?
- This one is to determine that you're only using lodash's `add` function when you do `import {add} from 'lodash'`, and thus not include all of lodash, but let Rollup tree-shake the rest?
Good point that the second one may need to run when your code changes, even if your dependencies don't change. You may change the import to say `import {add, startCase} from 'lodash'` and now your code won't work until you re-run @pika/web.
The way we would solve that using @pika/web right now would be to either (a) accept that you're getting the full lodash with no tree-shaking, or (b) be more specific within webDependencies, which results in a slimmer output but already requires an extra step.
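For reference, option (b) would look roughly like this in package.json (a sketch; the exact entry is illustrative, not taken from the thread):

```json
{
  "webDependencies": [
    "lodash-es/add.js"
  ]
}
```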
So if we want that kind of added output efficiency, I don't think we're going to get away from having an extra build step. That said, it would be a good use-case for a web watcher. Besides, a good argument for a web watcher is that we already need a dev server anyway, since the `file://` protocol doesn't allow script modules to work.
Thus @pika/web's new responsibility would be:
- Continuously watching your own (e.g. the user's) code,
- scanning it for changes to imports, and
- re-compiling used dependencies into web_modules as needed.
from snowpack.
And taking that last comment a step further:
- if Pika's sole responsibility is the overall facilitation of the creation of web apps that use native ES Modules,
- but to make this facilitation convenient, we need a watcher to scan our imports,
- and users would need a local dev server anyway, since ES Modules aren't compatible with the file:// protocol,
- then it seems Pika should serve both roles, and give the user a one-stop-shop for writing web apps that want to use ES Modules.
It would play a very similar role to e.g. create-react-app, where:
- you run the server,
- you write code,
- you install packages as-needed,
- it bundles your dependencies for you,
- but is unopinionated about how to bundle your code, leaving it up to you.
(Sorry, I may have gone a little overboard with lists today. Oops.)
I think this is a good path forward for Pika as a very strong alternative to create-react-app, personally. Especially if we can get the React ecosystem working well via #62.
from snowpack.
Additionally, the entire time that Pika's watcher is running, any time that your code changes, your web_modules will always be up-to-date and ready for deployment. So you could quit the watcher immediately and just deploy, no need for a separate build phase like create-react-app has.
This use-case is made stronger by the fact that HTTP/2 is widely supported, so there's no issue about serving 20-50 dependencies as individual JavaScript files, and 50-100 of the user's own un-bundled JavaScript files. The user can add their own bundling step if they want, as you advocate, but they don't have to.
Overall I feel even more like this direction is strong competition with create-react-app.
from snowpack.
The only thing about this plan that I slightly don't like is that user code would look like this:
```js
import React from "/web_modules/react";
import styled, { keyframes } from "/web_modules/styled-components";
// ...
```
It would feel much better to have this:
```js
import React from "react";
import styled, { keyframes } from "styled-components";
// ...
```
But maybe that's just because it's how I'm used to doing things in Node.js and create-react-app.
Either way, the first one would still be accurate, since dependencies would in fact show up inside web_modules after Pika is done compiling them for you.
And thus IDE support would still be feasible, even if it may sometimes need configuring where "root" is (e.g. `public/`).
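The mapping between those two import styles is mechanical; a hypothetical helper (matching the extensionless paths in the example above) would be something like:

```javascript
// Hypothetical sketch: rewrite a bare specifier to the /web_modules/ URL
// the browser would actually fetch; relative and absolute paths pass through.
function toWebModulesPath(specifier) {
  if (specifier.startsWith('.') || specifier.startsWith('/')) return specifier;
  return `/web_modules/${specifier}`;
}
```

So `toWebModulesPath('react')` gives `'/web_modules/react'`, while `'./app.js'` is left alone.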
However, there may be a use-case for a separate "build" mode that disables emitting source maps and enables minification and other optimizations. That's a bigger part of this plan I'm not a fan of. I like the simplicity of there being "no actual build step", but simply Pika saying "here, let me compile all your dependencies for you live".
In the future this could be handled by a smarter package manager, which installs ESM-compatible modules directly into web_modules for you, but for now, that's the role Pika plays. (Although we'd still need a dev server anyway, because of the `file://` + ESM issue.)
from snowpack.
The "bare import" idea, `import React from "react"`, depends on a registry which is still under ECMA discussions. The idea is that the module loader has access to metadata that provides info for finding the module, i.e. converting a bare name into a full local path.
I believe there are a few repos out there that claim to solve the registration problem, following the dormant ECMA spec. And workflow tools like webpack can do it, but at the cost of a heavy workflow, recompiling JS on every update.
Node has a nice resolution strategy, but is bound to package.json and common conventions. And rollup has rollup-plugin-node-resolve, but is limited to package.json metadata AFAIK.
https://github.com/standard-things/esm is a superb solution for importing modules into node. Possibly a similar loader could be used by us? I.e. insert a smart loader at the top of each module that can resolve the rest.
from snowpack.
An alternative to the Pika/watcher plan would be to just be okay with over-bundling during the development phase, putting every module (not even just whitelisted ones) into web_modules, and only cleaning, minifying, and tree-shaking them during a production build. Then `pika-web` (with `--development` implied) could be hooked into a post-package-install phase, and only when you want to actually release for production would you run `pika-web --production` to optimize web_modules.
from snowpack.
@backspaces My comment you replied to was kind of tangential. I love the idea that Pika never touches or compiles your source files -- they exist as-is, and all Pika ever does is make your dependencies more convenient to use. But in order for user source code to remain as-is, its import paths need to be correct both before compilation of dependencies and after, i.e. for Pika to resolve dependencies and for the browser to resolve dependencies. But I'm mostly satisfied with the `/web_modules/`-prefix solution that I suggested at first. I only mentioned it because it seems... not ideal. But good enough for the rest of the plan of this issue to move forward with.
from snowpack.
Since currently Pika scans either dependencies or webDependencies, it doesn't have code to scan a project's source code. So I prototyped a little code that uses the "find-imports" library:
```js
const findImports = require('find-imports');

findImports(['public/**/*.js'], {
  flatten: true,
  absoluteImports: true,
  relativeImports: false,
  packageImports: true,
})
  .filter(s => s.startsWith('/web_modules/'))
  .map(s => s
    .replace(/^\/web_modules\//, '')
    .replace(/\.jsx?$/, ''));
```
This works well and correctly returns `[ 'htm/react', 'react', 'luxon', 'styled-components' ]` in my test project.
However, it also throws a `SyntaxError` that says "Support for the experimental syntax 'dynamicImport' isn't currently enabled", and I can't find that option anywhere inside find-imports. I assumed maybe it was inside its dependency "esprima", but I couldn't find it in that project's documentation either.
The ability to also find dynamic import() expressions is very useful for this feature, so it's worth looking into further.
from snowpack.
It just occurred to me that we can't reliably determine a path that's passed to dynamic `import()` expressions, because they're dynamic. In retrospect this should have been obvious. This means we can only support `import()` expressions when there's a string literal in there. Even though it's possible to pass a non-literal string, there are legitimate cases for passing a string literal, such as when the developer knows ahead of time what module to include, but wants to speed up load time by including it on-demand. This is most likely going to be the case with third-party modules, which probably won't ever be dynamically decided upon. So I think it's still safe to move forward with this.
from snowpack.
I could not get scanning of dynamic imports working using the find-imports package, but I did get it to work using acorn, acorn-walk, and acorn-dynamic-import. But these will need file-finding (glob?) functionality to actually get the JS source to pass to acorn, which does not sound fun.
from snowpack.
Got the majority of this working in small independent snippets. Gonna try to tie it all together now in my fork.
from snowpack.
Just realized that my idea of scanning the project for .js files doesn't take into account embedded JS inside HTML (within a module script tag). Maybe there's a higher-level library that can scan all JS code, even embedded code, within a directory. This sounds like something webpack probably already knows how to do.
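Absent such a library, a rough sketch of pulling module-script bodies out of HTML so they can go through the same import scanner as .js files (regexes on HTML are fragile; a real HTML parser would be safer):

```javascript
// Sketch: extract the bodies of <script type="module"> tags from an HTML
// string, so each body can be scanned for imports like a standalone file.
function moduleScriptBodies(html) {
  const re = /<script\s+type=["']module["'][^>]*>([\s\S]*?)<\/script>/gi;
  return [...html.matchAll(re)].map(m => m[1]);
}
```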
from snowpack.
I've gotten the vast majority of this feature working, and have tested it in the browser. All that's mainly left is to figure out how to implement cache busting (maybe versioning) to address #69. Will collaborate on that issue to see if we can find some reasonable solution. Besides that, I think both #68 and #62 are ready for final implementation followed by a PR!
from snowpack.
Using a file watcher (chokidar), my prototype is able to install dependencies as-needed by scanning imports.
It looks for the package name by removing the `/web_modules/` prefix from the beginning, removing ".js" from the end, and then taking the first path item. For example, given "/web_modules/styled-icons/fa-solid/Clipboard", it will give you "styled-icons".
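That extraction can be sketched as follows (the scoped-package branch is my addition, not something the prototype is stated to handle):

```javascript
// Sketch: strip the "/web_modules/" prefix and a trailing ".js"/".jsx",
// then take the first path segment. Scoped packages ("@scope/name") span
// two segments, so keep both in that case.
function packageNameFromPath(p) {
  const bare = p.replace(/^\/web_modules\//, '').replace(/\.jsx?$/, '');
  const parts = bare.split('/');
  return parts[0].startsWith('@') ? parts.slice(0, 2).join('/') : parts[0];
}
```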
But it doesn't take into account aliased NPM packages. For example, the sample code I've been testing my prototype on installs @reactesm/react in place of "react", like so:
```sh
npm i react@npm:@reactesm/react
```
so that all packages that `import 'react'` are really importing @reactesm/react.
I figured that the Rollup plugin I described in #62 might be a good place to map this over, so that when Rollup sees "react", it would replace it with "@reactesm/react" no matter which file (mine or a dependency) imported "react".
That's the solution I'm in the middle of working on right now, in case anyone has any insights or thoughts about this.
Currently it's not working because it says:
```
(node:19986) UnhandledPromiseRejectionWarning: Error: Could not load @reactesm/react (imported by node_modules/styled-components/dist/styled-components.browser.cjs.js): ENOENT: no such file or directory, open '@reactesm/react'
```
It's clearly looking for a file, which makes sense for the other solution I mentioned in #62:
```js
const aliasesOstensiblyFromPackageJson = {
  'styled-components': 'node_modules/styled-components/dist/styled-components.browser.cjs.js',
};

function renameModuleAliases(aliases) {
  return {
    name: 'pika:rename-module-aliases',
    resolveId(src, loader) {
      return aliases[src] || null;
    },
  };
}

// ...
plugins: [
  replaceProcessEnv(),
  renameModuleAliases(aliasesOstensiblyFromPackageJson),
  resolve(),
  commonjs(),
],
// ...
```
So I guess what I need to do is make another Rollup plugin, one that maps one dependency name to another dependency name, not to a file path.
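A sketch of what that name-to-name plugin could look like (hypothetical plugin name; `this.resolve` with `skipSelf` hands the aliased name back to the other plugins, such as rollup-plugin-node-resolve, so they can locate the real file):

```javascript
// Hypothetical Rollup plugin sketch: alias one package specifier to
// another by name, then re-run resolution on the aliased name.
// skipSelf prevents infinite recursion back into this same hook.
function aliasPackages(aliases) {
  return {
    name: 'pika:alias-packages',
    resolveId(source, importer) {
      const target = aliases[source];
      if (!target) return null;
      return this.resolve(target, importer, { skipSelf: true });
    },
  };
}
```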
from snowpack.
My somewhat-hacky solution to this is to install the dependencies like this:
```js
const { execSync } = require('child_process');

const npmAliases = {
  'react': 'npm:@reactesm/react',
  'react-dom': 'npm:@reactesm/react-dom',
};

// makeName maps a package name through the alias table; the Set
// de-duplicates, since Array has no built-in unique() method
const depStrings = [...new Set(
  deps.map(([name, version]) => makeName(name, npmAliases))
)].join(' ');

execSync(`npm install ${depStrings}`);
```
Which results in running this:
```sh
npm install chart.js \
  htm \
  react-dom@npm:@reactesm/react-dom \
  react@npm:@reactesm/react \
  styled-components \
  styled-icons
```
This way it's more obvious how it works.
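For completeness, one plausible shape for the `makeName` helper referenced above (hypothetical -- the thread never shows its actual implementation):

```javascript
// Hypothetical makeName: append the npm alias spec when the package has one,
// producing e.g. "react@npm:@reactesm/react"; otherwise return the name as-is.
function makeName(name, aliases) {
  return aliases[name] ? `${name}@${aliases[name]}` : name;
}
```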
from snowpack.
After doing a little work to make sure this code only runs when a dependency isn't found in `pkg.dependencies` (by name only, ignoring version for now), I finally have a fully working prototype of this concept I've been brainstorming up in this issue, #62, and #69.
It's currently at https://github.com/sdegutis/dep-bundler-prototype. I'm going to try to turn it into a NPM CLI utility and publish it to NPM. It's full of edge-cases because I only focused on the evergreen-path, since it's a proof-of-concept, but I think it's a good start.
Also, it's not a fork of this repo because it shares almost no code in common with it. Somehow, I was able to get a fully working prototype using only code found in these three issues plus my own additions and glue code. But looking at the source of pika/web, it shares almost nothing in common, so I'm not exactly sure what pika/web does that my prototype doesn't.
from snowpack.
Just pinging everyone to say that I haven't dropped this! I've also done some thinking on it since the original discussion, and I think I have a suggestion for how to implement this without breaking the promise of "only run when your dependencies change".
Because tree-shaking is a production optimization (the perf impact when serving local files in development should be minimal), let's add this "Import Analysis" feature behind the existing production `--optimize` flag. That way, you continue to only run @pika/web when your dependencies change during development, and now you can optionally run `--optimize` every time you deploy to production to get that big tree-shaking perf win.
We'd probably need a new `--optimize-from` option to set some glob matching for which files to analyze.
```sh
## Example: You write & serve JS from a "src/" directory to production
pika install --optimize --optimize-from src/**
## Example: You write TypeScript in a src/ directory, and then build that to a "static/js" directory
pika install --optimize --optimize-from static/js/**
```
from snowpack.
This feature request has been moved!
Continue the discussion on our new message board:
https://www.pika.dev/packages/@pika/web/discuss/1092
Pika Discuss: A Q&A discussion board for your JavaScript packages
from snowpack.