
Call for new maintainers/collaborators #108

Open
simonwalsh opened this Issue Nov 2, 2017 · 17 comments

Collaborator

simonwalsh commented Nov 2, 2017

We are looking for maintainers and contributors, as Dynamo as a group will not be continuing active maintenance and feature development for Shopify Pipeline (and its scaffolder, create-shopify-pipeline).

If you are interested in doing so, please comment here and I will get in touch.

We will keep this project in the Dynamo organization unless and until we need to move it, so as not to break any projects depending on it.

Let's keep Shopify Pipeline alive!

@simonwalsh simonwalsh self-assigned this Nov 2, 2017


t-kelly commented Nov 6, 2017

Hey @simonwalsh -- I'm interested in integrating the work you've done into the shopify/slate project. What you and @nddery have created in this repo goes largely in the same direction we want to take Slate, and Shopify-Pipeline would make an excellent starting point to expand upon and make even better for Slate v1.0!


Collaborator

nddery commented Nov 16, 2017

Just wanted to leave a note here -- @simonwalsh and I met with @t-kelly a couple of days ago to discuss this further and we’ve all agreed this would be the best path forward for both projects. @t-kelly has already begun work in this direction, so stay tuned for what's next.

@t-kelly, as we’ve said before, if there’s anything we can help with please let us know, we’re glad to help.


lastobelus commented Jan 28, 2018

@simonwalsh @t-kelly After using shopify-pipeline heavily for two months with collaborators on a largish Shopify Plus customer's project, I'm not sure that it's the right path forward. For reference, I've also worked with a hand-rolled gulp-based build system in the past, and I'm probably going to replace shopify-pipeline with another gulp-based build system, hand-rolled to work on the shopify-pipeline directory structure so that I don't have to move files around. Again.

Points of Pain

  1. Build system in a separate package is a big productivity killer. Some things you can tweak/add in the user configs, but when you have to tweak shopify-pipeline itself, development is very slow. To switch to a local copy, you have to change your project package.json to point to it and run yarn install (which takes time). Then, each time you make a change, you still have to increment the version in shopify-pipeline's package.json and run yarn upgrade shopify-pipeline in your project. When you're done, you have to switch your project package.json back, increment shopify-pipeline's package.json, push it up, and run yarn install again in your project.

  2. Abstraction & DRY are good in code, but should be avoided in configuration files. They turn configuration into magic black boxes that people avoid touching.

  3. HMR doesn't work over tunnels. This makes it useless for collaboration unless your client is in your office. Maybe someone can make HMR work with tunnels -- we spent a bunch of money on Upwork and got nowhere.

  4. Filename hashing & asset subdirectories seem great until the first time your client spends a weekend hotfixing a bunch of stuff in the Shopify theme editor and you have to figure out how to merge it.

  5. In watch mode, webpack re-emits all the JS & CSS files and their maps on every incremental build, even if only a liquid file changed.

  6. In watch mode, webpack re-emits every snippet/theme with a local (i.e., built) asset_url on every build, regardless of whether the asset it points at changed. The liquid-loader calls this.cacheable, which should prevent this -- it might be fixable, but I haven't dug into it.

  7. Because of 4+5, a watch + upload strategy replacement for HMR is very slow, since the map files are large.

  8. What webpack watch reports to a custom plugin is kind of bizarre, and makes it difficult to reliably build a plugin that efficiently chooses a subset of files to upload to Shopify on a change. **

  9. The webpack ecosystem is fragmented & confused. Webpack may have won the space, but the battleground is full of shell holes. It's reminiscent of working with Java in the early 2000s.

  10. No clean way to add entry points. You need multiple entry points: at the very least, search & checkout entry points for JS & CSS, and it would be better to have per-template entry points (so that, for example, the home page isn't loading 250k of JavaScript that only the product page needs).

  11. It's really hard to find out simple things. For example, I've been trying to google how to turn off filename hashing in watch mode. No luck so far.

** NOTES about webpack watch + custom plugins:
You get the list of chunks -- and it's always ALL the JS & CSS entry points (see #4) -- but not the paths to their built files, so you have to build them from the dist dir & current hash. You can get the asset files that the user changed from compilation.timestamps. You can find all the liquid files which were re-emitted (see #5) by walking compilation.assets and finding all the ones which don't have an emitted: true property. But the files with references to the entry points aren't in compilation anywhere. If I can figure out how to get webpack watch not to re-emit everything on every change, and/or if I do my own hashing of files, maybe I'll get to the point of having an efficient subset of files to automatically upload on a change. But even counting the sunk investment, it feels like it would be more cost effective to just build out the system I need in gulp.
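The asset-walking part of these notes can be sketched as a small helper plus a plugin shell. This is only an illustration: the names (bucketByEmitted, EmittedAssetsReporter) are made up, and the 'after-emit' hook and per-asset emitted flag are assumptions based on the webpack 2/3 plugin API.

```javascript
// Group asset names in compilation.assets by their `emitted` flag,
// as described in the notes above. Which bucket means "actually changed"
// depends on webpack's behavior in your version -- verify before relying on it.
function bucketByEmitted(assets) {
  const emitted = [];
  const skipped = [];
  Object.keys(assets).forEach((name) => {
    (assets[name].emitted ? emitted : skipped).push(name);
  });
  return { emitted, skipped };
}

// Minimal plugin shell (webpack 2/3 style) that logs the buckets after emit.
class EmittedAssetsReporter {
  apply(compiler) {
    compiler.plugin('after-emit', (compilation, callback) => {
      const buckets = bucketByEmitted(compilation.assets);
      console.log('emitted:', buckets.emitted, 'skipped:', buckets.skipped);
      callback();
    });
  }
}

module.exports = { bucketByEmitted, EmittedAssetsReporter };
```

A real uploader would feed one of these buckets (cross-checked against compilation.timestamps) into the upload step rather than logging it.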


shelldandy commented Jan 28, 2018

HMR doesn't work over tunnels.

😢

Filename hashing & asset subdirectories seem great until the first time your client spends a weekend hotfixing a bunch of stuff using the shopify theme editor and you have to figure out how to merge it.

lol, that's a classic one. Apparently the underlying shopify-node API lets you pull those changes, but it's best to educate the client not to do hotfixes on their own, or to give them tools to edit things without touching code (sections, superfields, etc.). That's what has worked best for us to keep clients from touching the rendered code.

tl;dr shopify development is messy/hacky but we have to do our best with the tools we have.

PS: I use a mixture of webpack strictly for JS and gulp for everything else -- maybe it can work for you:

https://github.com/Pixel2HTML/shopify-skeleton


lastobelus commented Jan 29, 2018

Always-emitting JS is solved by downgrading back to webpack 2.x; there are currently open issues about this for webpack 3.x.

I found out about yarn link, which is speeding up my experimentation a lot! You have to add:

resolve: {
  symlinks: false
},
module: {
  rules: [
    {
      enforce: 'pre',
      test: /\.js$/,
      exclude: commonExcludes('/node_modules/'),
      loader: 'eslint-loader',
      options: {
        configFile: config.paths.eslintrc
      }
    },
    // ...
  ]
}

for it to work.

And I have successfully turned off filename hashes:

output: {
  filename: '[name].js',
  chunkFilename: '[name].js'
},

plugins: [
  new ExtractTextPlugin('[name].css')
]

Part of what was frustrating me there was a zombie node watch process! It was taking just a little longer than the current watch, so it would overwrite the non-hashed-filename version with the hashed-filename version :/. There were no other signs of it running.

I've decided to give up on writing a webpack plugin to decipher which files actually changed and upload them to Shopify, and will instead write a separate watcher that maintains a cache of md5 hashes of the files in the dist dir and only uploads touched files whose hashes have changed.


lastobelus commented Jan 29, 2018

@mike3run That looks pretty great!

Have you tried running your browserify setup with a tunnel? I think Browsersync has a built-in tunnel feature using localtunnel.me, but I haven't tried it.

How would you apply it to an existing project?


lastobelus commented Jan 29, 2018

@t-kelly If you do use a pure webpack build system, I recommend not using filename hashing. Shopify's CDN already handles caching for us, so it's pointless, and when I turned it off, build speed increased by around an order of magnitude.

Apparently it might be tricky to get source maps working without filename hashing, but they weren't working for me from the beginning anyway.


shelldandy commented Jan 29, 2018

I don't understand the tunnel part you're describing -- do you mean a proxy? Currently everything works as a proxy without any live reload: everything from src gets worked on and copied to a ./deploy dir, where I have gulp-changed set up to only upload files that change from there.

https://localhost:3000 maps to your theme preview

For existing projects, I usually download the theme from Shopify, run the scaffold, and then just place everything accordingly (images in the images folder, fonts in the fonts folder, JS in the js folder, and so on).

Edit: just googled the localtunnel thing. It seems good, but I don't really see the point when your dev theme is also accessible via the Shopify admin in the preview section.

You can even share it with:

https://{{SHOP_NAME}}.myshopify.com?preview_theme_id={{THEME_ID}}
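If you ever script this, the shareable preview link above is trivial to build. The helper below is purely illustrative (the function name is made up; SHOP_NAME and THEME_ID are whatever your store and theme use):

```javascript
// Build the shareable theme-preview URL pattern shown above.
function previewUrl(shopName, themeId) {
  return 'https://' + shopName + '.myshopify.com?preview_theme_id=' + themeId;
}
```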


lastobelus commented Jan 29, 2018

@mike3run With your browserify setup, the tunnel would just mean that if you were collaborating with someone not on your LAN, their window would refresh when you made changes -- as opposed to HMR, which results in a completely broken site for anyone who can't see your localhost.


shelldandy commented Jan 29, 2018

Ohh I see. Well, you can turn it on and try it. I don't have fast internet, so when I turned it on it said it works, but I guess it's not uploading fast enough, so I'm getting a timeout issue on the tunnel URL.

I lean more towards each developer having their own .env, theme, and private API keys to prevent overwriting other people's changes.


lastobelus commented Jan 29, 2018

@mike3run We use multiple themes, but the sticking point is more about demoing and making changes with the end client.


shelldandy commented Jan 29, 2018

Maybe localtunnel doesn't work here. I tested a React app and it works, but Browsersync with proxy returns a 504 timeout error -- maybe related to:

localtunnel/localtunnel#82

I usually give clients the myshopify theme preview URL, and they can refresh as we speak on a call.


shelldandy commented Jan 29, 2018

localtunnel/localtunnel#81

Seems like the way to go is to use ngrok and not localtunnel 🤔


Collaborator

nddery commented Jan 29, 2018

Hey - to speed up working with a local version of Shopify-Pipeline, you can update your project package.json to point to your local copy of the shopify-pipeline script. For example:

{
  "scripts": {
    "serve": "../shopify-pipeline/cli.js serve",
    "build": "../shopify-pipeline/cli.js build",
    "deploy": "../shopify-pipeline/cli.js build --deploy",
    "test": "../shopify-pipeline/cli.js test --coverage"
  }
}

This way, there's no need to update the version, publish packages, or deal with yarn/npm link.

Also, could you please discuss tunnelling in the issue already opened for that, or open new issues for any new problems that arise -- thank you.


t-kelly commented Jan 29, 2018

@lastobelus Lots of great feedback that I'll take into consideration when developing Slate v1. We'll be releasing a public alpha of it in the next week or two, and I'll ping this channel when it happens.


lastobelus commented Jan 29, 2018

@t-kelly I let my own rant inspire me to spend a day messing with shopify-pipeline (in my fork), and got turnaround times with webpack watch mode to under 10 seconds for JS, under 5 seconds for CSS, and under 3 seconds for snippet changes.

I used node limiter to take advantage of the API's leaky-bucket rate limiting and upload changes in parallel. It maintains a cache of file md5s, so it only uploads files that have actually changed -- which, once I turned off filename hashing, is down to a reasonable number per change.

I did a little testing of source maps; in watch mode, cheap-module-eval-source-map gave the best overall turnaround time, with the slight disadvantage that in the browser you get the filename and line number, but the generated code rather than the original code.

I added thread-loader in front of babel, but the gain was small (around half to one second).

I tried thread-loader in front of css loaders, but had trouble getting postcss-loader to work properly with it, and didn't pursue it because it didn't seem likely the benefit would be significant.

I strongly recommend using something like my uploader to do the uploads. In my tests, OMM, parallelizing shopify-themekit (calling it once per file, in parallel) offered benefits all the way up to the API bucket size, and using node limiter makes it dirt simple. I watched memory consumption while doing 80 concurrent uploads, and it remained reasonable OMM. My uploader does use node-watch to do a recursive watch of the dist directory, so some work would be needed to make it work on Linux.
