Tony Messias

June 25, 2022

Laravel Frontend Without Depending on Node

There are many options these days when it comes to bundling assets. In Laravel, we had Laravel Mix, which was a wrapper around webpack, and now we're getting a newer take on this that uses Vite to bundle our frontend.

In a previous blog post I shared two new libraries that allow us to have a frontend setup in Laravel without depending on Node or NPM: TailwindCSS Laravel and Importmap Laravel.

Let's see how they work in a quick guide.

TailwindCSS Laravel

TailwindCSS is built on top of PostCSS, so it depends on Node. However, the Tailwind Labs team built a standalone CLI tool on top of Vercel's Pkg, which packages a Node.js CLI tool into a single executable binary that can run anywhere, even where Node.js isn't installed.

With that in place, TailwindCSS Laravel is essentially a wrapper around that binary. You may install it like this:

composer require tonysm/tailwindcss-laravel

Next, you can download the binary by running:

php artisan tailwindcss:download

Each Operating System (OS) will need a different binary, so that command detects your OS and CPU architecture and ensures the correct binary is downloaded for you. Make sure you add `tailwindcss` (or `tailwindcss.exe` if you're on Windows) to your `.gitignore` file, so you don't accidentally track the binary in your Git repository.

To build your CSS files, you may run:

php artisan tailwindcss:build

This should read your local `tailwindcss.config.js` and compile your CSS for you. It also creates a new `public/.tailwindcss-manifest.json` file, which maps the CSS file aliases you reference to their actual locations/filenames.

You may import your compiled CSS file using the new `tailwindcss()` function in your Blade files, like so:

<link
  rel="stylesheet"
  href="{{ tailwindcss('css/app.css') }}" />

So, essentially, use `tailwindcss()` instead of `mix()` for your CSS files.

During development, you may want to keep a watcher running as you work on your views. For that, you may use the `tailwindcss:watch` command:

php artisan tailwindcss:watch

That should do the trick.

Since the Tailwind CLI now ships with the `postcss-import` plugin, we can even split our CSS across multiple files.
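For example, a hypothetical `resources/css/buttons.css` partial could be inlined into your main stylesheet with a plain `@import`, which `postcss-import` resolves at build time (the filename here is just for illustration):

```css
/* resources/css/app.css */
@import "./buttons.css"; /* hypothetical partial, inlined at build time */

@tailwind base;
@tailwind components;
@tailwind utilities;
```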

One more thing: for production, you may want to run the build command with the `--prod` flag, which minifies the CSS and appends a file digest (a hash based on the file's contents) to the filename. That's where the `public/.tailwindcss-manifest.json` file comes in handy. In our layout files, we only reference CSS files by their alias, something like `/css/app.css`, and the manifest maps that alias to something like `/css/app-83183718.css`. Every time a file's contents change, it gets a new digest. That's useful because we can safely adopt an aggressive caching strategy for these files without worrying about serving stale assets when we go live.

Importmap Laravel

Have you ever wondered why we need bundlers? There are many reasons, actually. Browsers used to limit the number of concurrent requests they would make to download your assets in parallel, so bundling everything into a single file made sense to avoid getting the browser stuck waiting on dozens of downloads. Also, browsers were slow to adopt modern JavaScript syntax, so tools like Babel were born. With those, we get to write modern JavaScript and compile it down to older JavaScript that legacy browsers can understand.

"So we need it," you may think. Turns out HTTP2 solved the parallel assets download limitation (read more here). Actually, having more files may even be better, since we can rely on an aggressive caching strategy for each file, so browsers would only download the files that actually changed.

Also, most modern browsers have caught up with modern JavaScript syntax, so we can ship the "good JavaScript" we write straight to the browser. Browsers also understand JavaScript's ES Modules (ESM), and with Import Maps, we can add a configuration script to our page that instructs the browser where to load each ES module from. So the code below works natively in the browser.

<script type="importmap">
{
  "imports": {
    "alpinejs": "https://ga.jspm.io/npm:alpinejs@3.10.2/dist/module.esm.js"
  }
}
</script>

<script type="module">
import Alpine from 'alpinejs';

window.Alpine = Alpine;

Alpine.start();
</script>

That's where the Importmap Laravel package comes in. We can install it like so:

composer require tonysm/importmap-laravel
php artisan importmap:install

The install command will create your entry point file `routes/importmap.php` and replace your JS scaffolding (assuming a fresh Laravel app or a starter kit like Jetstream/Breeze).

As hinted, you will have a single "entry point" for your import map configs located in `routes/importmap.php`. You may map Alpine to its ESM file like this:

<?php

use Tonysm\ImportmapLaravel\Facades\Importmap;

Importmap::pin("alpinejs", to: "https://ga.jspm.io/npm:alpinejs@3.10.2/dist/module.esm.js", preload: true);

That will work, but what about our own JS files? We can use the `pinAllFrom()` method from the Importmap Facade, which should map all files in our `resources/js` folder, creating import map config entries for all of them like so:

<?php

use Tonysm\ImportmapLaravel\Facades\Importmap;

Importmap::pinAllFrom("resources/js", to: "js/");

Importmap::pin("alpinejs", to: "https://ga.jspm.io/npm:alpinejs@3.10.2/dist/module.esm.js", preload: true);

We're now mapping all our JavaScript files and their dependencies to generate our import map config. To export it to the page, we can use the Blade component in our layout file like so:

<x-importmap-tags />

This should:

  1. Generate `link[rel=modulepreload]` tags for our preloads;
  2. Generate the `script[type=importmap]` with our importmap JSON;
  3. Import our main entry point file, which by default is our `app.js` file located in `resources/js`.
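Putting those three together, the rendered output would look roughly like this (the exact markup and local URLs are illustrative, not the component's literal output):

```html
<link rel="modulepreload" href="https://ga.jspm.io/npm:alpinejs@3.10.2/dist/module.esm.js">

<script type="importmap">
{
  "imports": {
    "app": "/js/app.js",
    "alpinejs": "https://ga.jspm.io/npm:alpinejs@3.10.2/dist/module.esm.js"
  }
}
</script>

<script type="module">import "app";</script>
```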

There's still one thing, though. By default, the package generates paths for our local JS files expecting them to be served by our web server, but in Laravel, only the `public/` folder is meant to be publicly accessible, for security reasons. So, in development, we need one more step: a symlink from our `resources/js` folder to `public/js` so the browser can load our files.

We need to add a new link path to our `config/filesystems.php` file:

<?php

return [
    // ...

    'links' => array_filter([
        public_path('storage') => storage_path('app/public'),
        public_path('js') => env('APP_ENV') === 'local'
            ? resource_path('js')
            : null,
    ]),
];

This ensures that when `APP_ENV=local`, we are mapping the `resources/js` path to our `public/js` folder. For production, we're going to use something else. Now, all we need to do is run the `storage:link` command to create the symlinks:

php artisan storage:link

Or, if you're using Laravel Sail, this:

sail artisan storage:link

Now we're all set to start developing.

Adding new external dependencies is a matter of running the pin command:

php artisan importmap:pin laravel-echo pusher-js

This would add pin entries for both `laravel-echo` and `pusher-js` to our `routes/importmap.php` file.

When it's time to deploy to production, you don't need to symlink the `resources/js` folder to `public/js`. Instead, the package ships with an `importmap:optimize` command that you can use:

php artisan importmap:optimize

That command will read your import map configuration, place your JS files in `public/dist/js`, append a file digest (a hash based on the file's contents) to each filename, something like `app-a123bf8.js`, and generate a `public/.importmap-manifest.json` with everything needed to build your import map JSON without scanning the files again.

One little "gotcha": we're used to relying on `process.env` to pass configuration from our `.env` file down to our JavaScript code. That's no longer possible; it's one of the things we lose by not using a bundler. But it's easily fixed by using meta tags and querying for them. To dynamically load Laravel Echo configurations into your JavaScript, for instance, you may add the following meta tags to your layout file's document head:

<meta
  name="echo-pusher-app-key"
  content="{{ config('broadcasting.connections.pusher.key') }}" />
<meta
  name="echo-pusher-use-tls"
  content="{{ config('broadcasting.connections.pusher.frontend.use_tls') }}" />
<meta
  name="echo-pusher-host"
  content="{{ config('broadcasting.connections.pusher.frontend.pusher_host') }}" />
<meta
  name="echo-pusher-port"
  content="{{ config('broadcasting.connections.pusher.frontend.pusher_port') }}" />

Then, in your `bootstrap.js` file, you may query the document for those configs like so:

import Echo from 'laravel-echo';
import Pusher from 'pusher-js';

window.Pusher = Pusher;

function metaContent(name) {
  return document.head.querySelector(`meta[name="${name}"]`)?.content;
}

window.Echo = new Echo({
  broadcaster: 'pusher',
  key: metaContent('echo-pusher-app-key'),
  forceTLS: metaContent('echo-pusher-use-tls') === "1",
  disableStats: true,
  wsHost: metaContent('echo-pusher-host') || window.location.host,
  wsPort: metaContent('echo-pusher-port') || null,
});

That should be it.

One more thing: make sure you import your local files without the `./` relative prefix. So this:

import 'bootstrap';

Instead of this:

import './bootstrap';

That's because imports are resolved against the bare module names mapped in the import map JSON. You can inspect the generated JSON config from your terminal by running this command:

php artisan importmap:json

Also, if you have a file called `index.js` inside a folder, you may specify just the folder name. So, for a `libs/index.js` file, you may import it like:

import 'libs';

Before we wrap up, there are two commands to help you keep your dependencies in check:

php artisan importmap:outdated

This will give you a list of your JS dependencies that have newer versions available.

php artisan importmap:audit

This should tell you which of your dependencies have known security vulnerabilities. It's a good idea to run it on a regular basis, maybe scheduled on your CI to run every week or so.

Conclusion

I hope this article gives you an overview of what a Node-less life looks like. It's actually pretty cool. If you're using something like Alpine or Stimulus, this might be all you need.

This approach won't work if you're using React with JSX or Vue with SFCs, though, since those need to be compiled before being sent to the browser. We also need to avoid practices like importing CSS files or images/SVGs straight from our JS files. That won't work here, so import your CSS files the way they were meant to be imported: link them in your document's head tag.

Anyways, let me know what you think!

- Tony
