Stolz / Assets

An ultra-simple-to-use assets management library for PHP

Command to Compile Assets

rajivseelam opened this issue

Hi,

The package minifies assets when we open the page in the browser. Is there any way to compile these assets via the command line?

The workflow I am trying to implement is:

  1. When I run a command, all assets are compiled.
  2. The compiled assets are uploaded to a CDN.

If I can achieve this in some other way, a recommendation would be great. Please let me know your thoughts.

When serving from a CDN, I guess I would also have to change relative paths via fetch_command, and I am not entirely sure how to do that either. I have noticed your previous comments on notify_command, but it's not entirely clear how I should use it. Since it is called after files are compressed and minified, should there be some conditional logic based on my environment?

This command works very well for me right now:

        'fetch_command' => function ($asset) {

            $content = file_get_contents($asset);

            // Path of the asset's folder relative to the public folder, used as URL prefix
            $prefix = str_replace(public_path() . '/', '', '/' . dirname($asset)) . '/';

            // Match url(...) references to images and fonts inside the CSS
            $regex = "/\burl\(\s?['\"]?(([^';]+)\.(jpg|jpeg|gif|png|eot|ttf|woff|svg|otf).*?)['\"]?\s?\)/";

            $filter = function ($match) use ($prefix) {

                // Do not process absolute URLs
                if ('http://' === substr($match[1], 0, 7) or 'https://' === substr($match[1], 0, 8) or '//' === substr($match[1], 0, 2)) {
                    return $match[0];
                }

                // Prepend the prefix to relative URLs
                return "url('" . $prefix . $match[1] . "')";
            };

            // Apply the filter to the asset's contents
            return preg_replace_callback($regex, $filter, $content);
        },
commented

There is no way to build the assets from the command line since the library is unaware of the application that uses it and assets could be added from anywhere in your application (controllers, views, presenters, middlewares, ...). The library is framework/application agnostic so it is unable to replicate your request cycle.

A possible approach to generating the assets via the command line could be to visit, with a command-line tool, all the routes that use assets. You have to figure out a way to parse your application routes and then visit them. If you use Laravel you could create an artisan command that parses the return value of Route::getRoutes() and then visits all the GET routes using Illuminate\Foundation\Testing\Concerns\MakesHttpRequests. Laravel even gives you traits to authenticate users in case some of your routes need authentication. A rough sketch of such a command follows.
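
Just as a sketch (the command name is made up, parameterized routes are skipped, requests are dispatched through the HTTP kernel rather than the testing trait, and route method names like methods()/uri() vary slightly between Laravel versions):

    <?php

    namespace App\Console\Commands;

    use Illuminate\Console\Command;
    use Illuminate\Contracts\Http\Kernel;
    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\Route;

    class CompileAssets extends Command
    {
        protected $signature = 'assets:compile';
        protected $description = 'Visit every GET route so that pipelined assets are generated';

        public function handle(Kernel $kernel)
        {
            foreach (Route::getRoutes() as $route) {
                // Only plain GET routes without parameters can be visited blindly
                if (! in_array('GET', $route->methods()) || strpos($route->uri(), '{') !== false) {
                    continue;
                }

                $request = Request::create('/' . ltrim($route->uri(), '/'), 'GET');
                $response = $kernel->handle($request);
                $kernel->terminate($request, $response);

                $this->info($route->uri() . ' -> ' . $response->getStatusCode());
            }
        }
    }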

The notify_command is called every time new pipelined assets are created, that is, every time new assets are compressed and minified. In other words, it's called every time a file is written to your min directories. If you visit a page that already had its assets pipelined, then notify_command won't be called. The logic of the notify_command should copy the newly created files to your CDN server.
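
For example, something along these lines (just a sketch, not documented usage: it ignores whatever arguments the callback receives, and it assumes the default 'min' pipeline folders and an 's3' filesystem disk):

    // Hypothetical notify_command: ignore the callback arguments and simply
    // mirror the local pipeline ('min') directories to the S3 disk.
    'notify_command' => function () {
        foreach (['css/min', 'js/min'] as $dir) {
            foreach (\File::allFiles(public_path($dir)) as $file) {
                \Storage::disk('s3')->put($dir . '/' . $file->getFilename(), \File::get($file->getRealPath()));
            }
        }
    },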

I use Laravel and my workflow is:

All my CSS/JS files always use paths relative to the css_dir/js_dir folder (not relative to the folder they are in). All my assets are loaded using paths relative to the css_dir/js_dir folder. My master layout uses the base HTML tag pointing to /. That way, not only do the asset names/paths stay short and consistent, but by simply changing the css_dir and js_dir options I can host my assets anywhere without having to process them with fetch_command.
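
For illustration only (the file name is made up; the Assets facade calls come from the library's Laravel integration), the layout idea looks roughly like this:

    {{-- resources/views/layouts/master.blade.php (sketch) --}}
    <!DOCTYPE html>
    <html>
    <head>
        <base href="/">
        {{-- Tags are generated from paths relative to css_dir/js_dir --}}
        {!! Assets::css() !!}
        {!! Assets::js() !!}
    </head>
    <body>
        @yield('content')
    </body>
    </html>

Assets are registered elsewhere (for example Assets::add('app.css') in a controller or view), always with names relative to css_dir/js_dir.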

Since all my controllers are covered by tests, I get a fresh copy of my pipelined assets every time I run phpunit. My deploy script uses an artisan command that copies the content of my pipeline dirs to my CDN (Amazon S3) using the Laravel Filesystem library. I don't use notify_command because I don't need the CDN in my development environment, so I copy the files only once, when I deploy, not every time they are generated.
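
A minimal sketch of such a deploy command (names and paths are illustrative; it uses the same copy loop as the notify_command sketch above and expects an 's3' disk in config/filesystems.php):

    <?php

    namespace App\Console\Commands;

    use Illuminate\Console\Command;
    use Illuminate\Support\Facades\File;
    use Illuminate\Support\Facades\Storage;

    class PushAssetsToCdn extends Command
    {
        protected $signature = 'assets:push-cdn';
        protected $description = 'Copy pipelined assets to the CDN (S3)';

        public function handle()
        {
            foreach (['css/min', 'js/min'] as $dir) {
                foreach (File::allFiles(public_path($dir)) as $file) {
                    Storage::disk('s3')->put($dir . '/' . $file->getFilename(), File::get($file->getRealPath()));
                    $this->line('Uploaded ' . $dir . '/' . $file->getFilename());
                }
            }
        }
    }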

In my file config/assets.php I have:

'pipeline' => env('ASSETS_PIPELINE', false),
'css_dir' => env('ASSETS_CSS_DIR', 'css'),
'js_dir' => env('ASSETS_JS_DIR', 'js'),

In my development environment my .env file has:

ASSETS_PIPELINE=false
ASSETS_CSS_DIR=css
ASSETS_JS_DIR=js

In my production environment my .env file has:

ASSETS_PIPELINE=true
ASSETS_CSS_DIR=css
ASSETS_JS_DIR=js

If I wanted to use the CDN in my development environment I would only need to update the .env file to:

ASSETS_PIPELINE=true
ASSETS_CSS_DIR=http://address.of.my.cdn/css
ASSETS_JS_DIR=http://address.of.my.cdn/js

If your CSS/JS files don't always use paths relative to the css_dir/js_dir folder then you have to use fetch_command to alter the paths.

Hi,

Thanks for the detailed reply. I will try the workflow you are using.

Hi,

This is what I tried:

With the following config, I ran my tests:

ASSETS_PIPELINE=true
ASSETS_CSS_DIR=css
ASSETS_JS_DIR=js

Minified assets are generated, and all of them are uploaded to S3.

Now I change the config to:

ASSETS_PIPELINE=true
ASSETS_CSS_DIR=http://address.of.my.s3/css
ASSETS_JS_DIR=http://address.of.my.s3/js

At this point I have already uploaded all the files, but if I now open the site in my browser, it tries to compile/minify the assets again.

Am I missing a step? Please let me know.

Hi,

Whenever I change css_dir and js_dir (pointing them to S3), it compiles a different hash, and I don't have those files uploaded to S3 at that point, because what I upload to S3 are the minified files generated in my local environment with pipeline set to true.