The current state of web application development
Users now expect a super-smooth, no-reload experience from the web. Unfortunately, that expectation is usually met with single-page applications (SPAs) that rely on libraries and frameworks like React and Angular, which are very specialised tools that can be complicated to work with.
A new approach is to put the ability to deliver this UX back into the hands of the engineers who built websites before the SPA craze, leveraging their existing toolsets and knowledge, and HTMX is the best example of it I've used so far.
The costs of SPA
SPAs have allowed engineers to create some great web applications, but they come with a cost:
- Hugely increased complexity, both in terms of architecture and developer experience. You have to spend considerable time learning about frameworks.
- Tooling is an ever-shifting landscape in terms of building and packaging code.
- Managing state on both the client and the server.
- Frameworks, on top of libraries, on top of other libraries, on top of polyfills. React even recommend using a framework on top of their tech:

React is a library. It lets you put components together, but it doesn’t prescribe how to do routing and data fetching. To build an entire app with React, we recommend a full-stack React framework.

- By their nature, fat clients require the browser to execute a lot of JavaScript. If you have modern hardware, this is fine, but these applications will be slow and unusable for those on older hardware or in locations with slow and unreliable internet connections.
- It is very easy to make an SPA incorrectly, where you need to use the right approach with hooks to avoid ending up with abysmal client-side performance.
- Some SPA implementations throw away progressive enhancement (a notable and noble exception is Remix). Therefore, you must have JavaScript turned on for most SPAs.
- If you wish to use something other than JavaScript or TypeScript, you must traverse the treacherous road of transpilation.
- It has created backend and frontend silos in many companies, carrying high coordination costs.
Before SPAs, you'd choose your preferred language and deliver HTML to a user's browser in response to HTTP requests. This is fine, but it offers little interactivity and, in some cases, can make for an annoying-to-use UI, especially as the page fully reloads on every interaction. To get around this, you'd typically sprinkle varying amounts of JS around to grease the UX wheels.
Whilst this approach can feel old-fashioned to some, it is what inspired the original paper describing REST, especially concerning hypermedia. The hypermedia approach to building websites is what led to the world wide web being an incredible success.
Hypermedia?
The following is a response from a data API, not hypermedia.
{
"sort": "12-34-56",
"number": "87654321",
"balance": "123.45"
}
To make this data useful in an SPA, the code must understand the structure and decide what to render and what controls to make available.
REST describes the use of hypermedia. Hypermedia is where your responses are not just raw data but are instead a payload describing the media (think HTML tags like <p>, headers, etc.) and how to manipulate it (like form and input).
A server returning HTML describing a bank account, with some form of controls to work with the resource, is an example of hypermedia. The server is now responsible for deciding how to render the data (with the slight caveat of CSS) and what controls should be displayed.
<dl>
<dt>Sort</dt><dd>12-34-56</dd>
<dt>Number</dt><dd>87654321</dd>
<dt>Balance</dt><dd>£123.45</dd>
</dl>
<form method="POST" action="/transfer-funds">
<label>Amount <input type="text" /></label>
<!-- etc -->
<input type="submit" value="Do transfer" />
</form>
The approach means you have one universal client, the web browser; it understands how to display the hypermedia responses and lets the user work with the "controls" to do whatever they need.
Carson Gross on The Go Time podcast
...when browsers first came out, this idea of one universal network client that could talk to any application over this crazy hypermedia technology was really, really novel. And it still is.
If you told someone in 1980, “You know what - you’re gonna be using the same piece of software to access your news, your bank, your calendar, this stuff called email, and all this stuff”, they would have looked at you cross-eyed, they wouldn't know what you were talking about, unless they happened to be in one of the small research groups that was looking into this sort of stuff.
Whilst people building SPAs ostensibly talk about using "RESTful" APIs to provide data exchange to their client-side code, the approach is not RESTful in the purist sense because it does not use hypermedia.
Instead of one universal client, scores of developers create bespoke clients, which have to understand the raw data they fetch from web servers and then render controls according to the data. With this approach, the browser is more of a JavaScript, HTML and CSS runtime.
By definition, a fatter client will carry more effort and cost than a thin one. However, the "original" hypermedia approach arguably is not good enough for all of today's needs; the controls the browser offers, and the full page refresh required to use them, mean the user experience isn't good enough for many types of web app we need to make.
HTMX and hypermedia
Unlike SPAs, HTMX doesn't throw away the architectural approach of REST; it augments the browser, improving its hypermedia capabilities and making it simpler to deliver a rich client experience without having to write much JavaScript if any at all.
You can use whatever programming language you like to deliver HTML, just like we used to. This means you can use battle-tested, mature tooling, using a "true RESTful" approach, resulting in a far more straightforward development approach with less accidental complexity.
HTMX allows you to design pages that fetch fragments of HTML from your server to update the user's page as needed, without the annoying full-page refresh.
We'll now see this in practice with the classic TODO-list application.
Clojure HTMX TODO
First of all, please don't get overly concerned that this is written in Clojure. I did it in Clojure for fun, but the beauty of this approach is that you can use whatever language you like, so long as it responds to HTTP requests.
Nothing special here, but it does feel like an SPA. There are no full-page reloads; it's buttery smooth, just like all the other SPA demos you would've seen.
The difference here is:
- I did not write any JavaScript.
- I also didn't cheat by transpiling Clojure into JavaScript. (see ClojureScript)
I made a web server that responds to HTTP requests with hypermedia.
HTMX adds the ability to define richer hypermedia by letting you annotate any HTML element to ask the browser to make HTTP requests to fetch fragments of HTML to put on the page.
The edit control
The most exciting and impressive part of this demo is the edit action. The way an input box instantly appears for you to edit, and then just as quickly updates the item again, feels like it would require either a lot of vanilla JS or a React-esque approach to achieve, but as you'll see, it's absurdly simple.
Let's start by looking at the markup for a TODO item. I have clipped the non-edit markup for clarity.
<li hx-target="closest li">
<form action="/todos/2a5e549c-c07e-4ed5-b7d4-731318987e05" method="GET">
<button hx-get="/todos/2a5e549c-c07e-4ed5-b7d4-731318987e05" hx-swap="outerHTML">📝</button>
</form>
</li>
It maybe looks like a lot, but these are the main things to focus on to understand how the edit functionality works:
- On the <li>, the attribute hx-target tells the browser, "When you get a fragment to render, this is the element I want you to replace". The children inherit this attribute, so for any HTMX actions inside this <li>, the HTML returned will replace the contents of the <li>.
- hx-get on the edit button means that when you click it, HTMX will tell the browser to do an HTTP GET to the URL and fetch some new markup to render into the <li> in place of what's there.
- The form is not essential for the example, but it allows us to support the functionality for non-JavaScript users, which will be covered later.
When you start working with HTMX, an easy way to understand what's going on is to look at the network in the browser's developer tools.
When a user clicks the edit button, the browser does an HTTP GET to the specific todo resource. The server returns a hypermedia response, which is a representation of that resource with some hypermedia controls.
<form action="/todos/45850279-bf54-4e2e-a95c-c8c25866a744/edit"
hx-patch="/todos/45850279-bf54-4e2e-a95c-c8c25866a744" hx-swap="outerHTML" method="POST">
<input name="done" type="hidden" value="false"/>
<input name="name" type="text" value="Learn Rust"/>
<input type="submit"/>
</form>
HTMX then takes that HTML and replaces whatever we defined as the hx-target. So instead of the row pictured before, the user now sees these hypermedia controls with which to manipulate the resource.
You'll notice the form has an hx-patch attribute, which means that when it is submitted, the browser will send a PATCH with the data to update the resource. The server then responds with the updated item to render.
Embracing the web
There's more to HTMX, but this is the crux of the approach, and it is the same approach with which most websites were made before SPAs became popular.
- The user goes to a URL.
- The server returns hypermedia (HTML), which is content with controls.
- The browser renders the hypermedia.
- The user uses the controls to do work, which results in an HTTP request sent from the browser to the server.
- The server does the business logic, and then returns new hypermedia for the user to work with.
All HTMX does is make the browser better at hypermedia, by giving us more options regarding what can trigger an HTTP request and by allowing us to update a part of the page rather than doing a full page reload.
By embracing hypermedia and not viewing the browser as merely a JavaScript runtime, we get a lot of simplicity benefits:
- We can use any programming language.
- We don't need lots of libraries and other cruft to maintain what were basic benefits of web development:
  - Caching
  - SEO-friendliness
  - The back button working as you'd expect
  - etc.
- It is very easy to support users who do not wish to, or cannot, use JavaScript.
This final point is crucial to me and to my current employer. I work for a company that works on products used worldwide, and our content and tools must be as usable by as many people as possible. It is unacceptable for us to exclude people through poor technical choices.
This is why we adopt the approach of progressive enhancement.
Progressive enhancement is a design philosophy that provides a baseline of essential content and functionality to as many users as possible, while delivering the best possible experience only to users of the most modern browsers that can run all the required code.
All the features in the TODO app (search, adding, editing, deleting, marking as complete) work with JavaScript turned off. HTMX doesn't do this for "free"; it still requires engineering effort, but because of the approach, it is inherently simpler to achieve. It took me around an hour's effort and did not require significant changes.
How it supports non-JavaScript
When the browser sends a request that was prompted by HTMX, it adds a header HX-Request: true, which means on the server we can send different responses accordingly, very much like content negotiation.
The rule of thumb for a handler is roughly:
parseAndValidateRequest()
myBusinessLogic()
if request is htmx then
  return hypermedia fragment
else
  return a full page
end
Here's a concrete example of the HTTP handler for dealing with a new TODO:
(defn handle-new-todo [get-todos, add-todo]
  (fn [req] (let [new-todo (-> req :params :todo-name)]
              (add-todo new-todo)
              (htmx-or-vanilla req
                (view/todos-fragment (get-todos))
                (redirect "/todos")))))
The third line is our "business logic", calling a function to add a new TODO to our list.
The fourth line is some code to determine what kind of request we're dealing with, and the subsequent lines either render a fragment to return or redirect to the page.
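The htmx-or-vanilla helper isn't shown above. As a rough sketch (assuming Ring-style request maps, where header names are lower-cased strings; the helper in the actual repo may differ, and could well be a macro so that only one branch is evaluated), it might look something like this:

(defn htmx-or-vanilla
  "Hypothetical helper: return the htmx response if the request was
  triggered by HTMX (signalled by the HX-Request header), otherwise
  return the full-page ('vanilla') response."
  [req htmx-response vanilla-response]
  (if (= "true" (get-in req [:headers "hx-request"]))
    htmx-response
    vanilla-response))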
So far, this has been a recurring theme when developing hypermedia applications with HTMX. By its very architectural nature, if you can support updating part of a page, you return a fragment; otherwise, the browser needs to do a full page reload, so you either redirect or return the entire HTML.
HTML templating on the server is in an incredibly mature state. There are many options and excellent guides on how to structure and add automated tests for them. Importantly, they all offer some composition capabilities, so returning either a fragment or a whole page is extremely simple.
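For instance, with a hiccup-style templating library in Clojure, a fragment and the full page can share the same function (the names below are illustrative, not the exact ones from the demo repo):

(defn todos-fragment [todos]
  [:ul#todo-list
   (for [todo todos]
     [:li (:name todo)])])

(defn todos-page [todos]
  [:html
   [:body
    [:h1 "My TODOs"]
    ;; the full page simply composes the same fragment
    (todos-fragment todos)]])

The handler can then return either todos-fragment or todos-page depending on whether the request came from HTMX.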
Why is it The Future?
Obviously, I cannot predict the future, but I do believe HTMX (or something like it) will become an increasingly popular approach for making web applications in the following years.
Recently, HTMX was announced as one of 20 projects in the GitHub Accelerator.
It makes "the frontend" more accessible.
Learning React is an industry in itself. It moves quickly and changes, and there is a ton to learn. I sympathise with developers who used to make fully-fledged applications but were put off by modern frontend development and were instead happy to be pigeonholed into being "backend" devs.
I've made reasonably complex systems in React, and whilst some of it was pretty fun, the amount you have to learn to be effective is unreasonable for most applications. React has its place, but it's overkill for many web applications.
The hypermedia approach with HTMX is not hard to grasp, especially if you have some REST fundamentals (which many "backend" devs should have). It opens up making rich websites to a broader group of people who don't want to learn how to use a framework and then keep up with its constantly shifting landscape.
Less churn
Even after over 10 years of React being around, it still doesn't feel settled and mature. A few years ago, hooks were the new-fangled thing that everyone had to learn and re-write all their components with. In the last six months, my Twitter feed has been awash with debates and tutorials about this new-fangled "RSC" - react server components. Joy emoji.
Working with HTMX has allowed me to leverage things I learned 15-20 years ago that still work, like my website. The approach is also well-understood and documented, and the best practices are independent of programming languages and frameworks.
I have made the example app in Go and Clojure with no trouble at all, and I am a complete Clojure novice. Once you've figured out the basic syntax of a language and learned how to respond to HTTP requests, you have enough to get going, and you can re-use the architectural and design best practices without having to learn a new approach over and over again.
How much of your skills would be transferable from React if you had to work with Angular? Is it easy to switch from one react framework to another? How did you feel when class components became "bad", and everyone wanted you to use hooks instead?
Cheaper
It's just less effort!
Hotwire is a library with similar goals to HTMX, driven by the Ruby on Rails world. DHH tweeted the following.
That's why it's so depressing to hear the term "full stack" be used as a derogative. Or an impossible mission. That we HAVE to be a scattered band of frontend vs backend vs services vs whatever group of specialists to do cool shit. Absolutely fucking not.
Without the cognitive overload of understanding a vast framework from the SPA world and the inherent complexities of making a fat client, you can realistically create rich web applications with far fewer engineers.
More resilient
As described earlier, using the hypermedia approach, making a web application that works without JavaScript is relatively simple.
It's also important to remember that the browser is an untrusted environment, so when you build a SPA, you have to work extremely defensively. You have to implement lots of business logic client side; but because of the architecture, this same logic needs to be replicated on the server too.
For instance, let's say we wanted a rule saying you cannot edit a to-do if it is marked as done. In an SPA world, I'd get raw JSON, and I'd have to have business logic to determine whether to render the edit button on the client code somewhere. However, if we wanted to ensure a user couldn't circumvent this, I'd have to have this same protection on the server. This sounds low-stakes and simple, but this complexity adds up, and the chance of misalignment increases.
With a hypermedia approach, the browser is "dumb" and doesn't need to worry about this. As a developer, I can capture this rule in one place, the server.
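As a sketch of what that can look like (hypothetical hiccup-style code, not the exact markup from the demo), the server simply omits the edit control when the rule says it shouldn't be there, and there is nothing to duplicate on the client:

(defn todo-item [todo]
  [:li {:hx-target "closest li"}
   [:span (:name todo)]
   ;; the business rule lives in one place: no edit button for completed items
   (when-not (:done todo)
     [:button {:hx-get  (str "/todos/" (:id todo))
               :hx-swap "outerHTML"}
      "📝"])])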
Reduced coordination complexity
The complexity of SPAs has created a shift into backend and frontend silos, which carries a cost.
The typical backend/frontend team divide causes a lot of inefficiencies in terms of teamwork, with hand-offs and miscommunication, and makes getting stuff done harder. Many people mistake individual efficiency for the most critical metric and use it as justification for these silos. They see lots of PRs being merged and lots of heat being generated, but ignore the coordination costs.
For example, let's assume you want to add a new piece of data to a page or add a new button. For many teams, that'll involve meetings between teams to discuss and agree on the new API, creating fakes for the frontend team to use and finally coordinating releases.
In the hypermedia approach, you don't have this complexity at all. If you wish to add a button to the page, you can add it, and you don't need to coordinate efforts. You don't have to worry so much about API design. You are free to change the markup and content as you please.
Teams exchanging data via JSON can be extremely brittle without care and always carries a coordination cost. Tools like consumer-driven contracts can help, but this is just another tool, another thing to understand and another thing that goes wrong.
This is not to say there is no room for specialisation. I've worked on teams where the engineers built the web application "end to end", but we had people who were experts on semantic, accessible markup who helped us make sure the work we did was of good quality. It is incredibly freeing not to have to negotiate APIs and hand off work to one another to build a website.
More options
Rendering HTML on the server is a very well-trodden road. Many battle-tested and mature tools and libraries are available to generate HTML from the server in every mainstream programming language and most of the more niche ones.
Wrapping up
I encourage developers looking to reduce the costs and complexities of web application development to check out HTMX. If you've been reluctant to build websites due to the fair assessment that front-end development is difficult, HTMX can be a great option.
I'm not trying to claim that SPAs are now redundant; there will still be a real need for them when you need very sophisticated and fast interactions where a roundtrip to the server to get some markup won't be good enough.
In 2018 I asserted that a considerable number of web applications could be written with a far simpler technological approach than SPAs. Now, with the likes of HTMX, this assertion carries even more weight. The frontend landscape is dominated by waiting for a new framework to relieve the problems of the previous framework you happened to be using. The SPA approach is inherently more complicated than a hypermedia approach, and piling on more tech might not be the answer; give hypermedia a go instead.
Check out some of the links below to learn more.
Further reading and listening
- The author of HTMX has written an excellent, free book, explaining hypermedia. It's an easy read and will challenge your beliefs on how to build web applications. If you've only ever created SPAs, this is an essential read.
- HTMX. The examples section, in particular, is very good in showing you what's possible. The essays are also great.
- I was lucky enough to be invited onto The GoTime podcast with the creator of HTMX, Carson Gross to discuss it! Even though it's a Go podcast, the majority of the conversation was about the hypermedia approach.
- The Go version was my first adventure with HTMX, creating the same todo list app described in this post
- I worked on The Clojure version with my colleague, Nicky
- DHH on Hotwire
- Progressive enhancement
- Five years ago, I wrote The Web I Want, where I bemoaned the spiralling costs of SPAs. It was originally prompted by watching my partner's 2-year-old ChromeBook grind to a halt on a popular website that really could've been static HTML. In the article, I discussed how I wished more of the web stuck to the basic hypermedia approach, rendering HTML on the server and using progressive enhancement to improve the experience. Reading back on this has made me very relieved the likes of HTMX have arrived.
Top comments
I have checked the htmx reference. There are about 120 new keywords introduced to do things that can easily be done with JavaScript. What is so bad about JavaScript? As far as I have seen, you cannot even count from 1 to 10 with htmx...
You seem to be mistaking htmx for a programming language. This post is also not explicitly about language, but about the architectural approach.
I'm just asking if you need a new tool to do things that can already be done in a more efficient way, just to avoid JavaScript. Does this make our life better or simpler?
Over the last few years we have seen new tools every week, and each comes with a new syntax, new keywords, and new limitations.
Well, one of the points of the argument is, yes, it can make life simpler.
Also, it's disingenuous to say it is purely to avoid writing JavaScript. It's a simpler approach, returning to RESTful principles rather than fat clients.
With this logic we can say, do you really need a high-level language to compile to assembly, when you can already write assembly in a more efficient way, just to avoid assembly? Does this make our life simpler or better?
It is not really an homologous situation. @efpage is explicitly making this claim in the context of htmx adding 120 new keywords in order to achieve its goal. Their point being that this may have significant overhead in terms of complexity.
Whether that is correct is a separate issue, but you cannot really compare it to assembly vs high level languages, although I will posit that yes, whether or not a high level language makes things easier than assembly (or C, for a more realistic case) should be a metric that all high level languages are measured by.
...AND avoid writing JavaScript.
It was meant as a tongue-in-cheek comment. ;P
htmx offers a hypermedia-first counterpoint to the js maximalist trend of the last decade. Of course, one can always leverage javascript for islands of interactivity, where necessary.
There are some interesting pieces of misdirection here.
First one: "React even recommend using a framework on top of their tech".
Yes, React, the front end library, recommends using a full stack framework if you need to do server side rendering.
The second one is this claim: "The hypermedia approach of building websites led to the world-wide-web being an incredible success." Not sure if that is an assertion that can be made so casually. So many things contributed to the success of the web, and if this hypermedia paradise (that I'm not sure we ever had) had been instrumental to it, maybe it would've collapsed without it?
The third one is the waxing poetic about the browser as the magical app that allows you to do anything... as if this had happened in spite of JavaScript as opposed to because of it.
Ahh, it didn't happen because of javascript. Your claim is just ridiculous. You keep raising objections that are not substantiated.
Well, maybe not just because of JavaScript, but certainly many successful web applications rely heavily on it. You can't have Google Docs without JavaScript. You can't have live chat in the browser without JavaScript. You can't have sophisticated graphical interactions without JavaScript. Without it, you can have things like email clients, bank clients, news readers, etc. Yes. But apps that did all that already existed even before the web browser (CompuServe, anyone?), and the web did not dethrone them because it was necessarily better, but because it was open. Now, was hypermedia instrumental to that openness? Maybe. It surely is a smart way of building an open system, but it is not the only way.
To reiterate: my comment was directly related to the assertion that the Web browser was this unprecedented "universal client", which is only true in a meaningful way if you acknowledge the big boost the JavaScript layer adds to it. Without JS, the web is more like a universal BBS: a wonderful thing to have, for sure, and unprecedented in terms of scope... but hardly breaking new ground in terms of client side capabilities.
In fact, during the early 2000s embedded front-end technologies were striving to be the missing link between HTML and real interactivity. Things like Java applets and Flash. Eventually, through things like AJAX and the canvas tag, the browser managed to provide those interactions on its own (through JS), and it drove those complementary technologies away. But there's a reason they were seen as a necessary component in the first place.
I get your point now. I was like "other stuff existed before JavaScript".
It's only about terminology, but HTMX and Hotwire are SPAs, whether you like it or not. They expect server-side rendered HTML rather than doing client-side rendering like, e.g., React; but the fact that they swap content in place without reloading the whole page makes them SPAs.
(See "Naming things is hard, SPA edition" by Thomas Broyer, Mar 28 '23.)
About HTMX, I 100% agree with that progressive-enhancement approach, particularly for forms, edit-in-place, or possibly "load more" buttons (for navigation though, unless you have a very good reason not to, e.g. playing video or sound, I think MPA is better).
No, they are not SPAs; SPAs are not the tools you use to build your app. An app isn't an SPA because it uses React; it's because of how it's built. React doesn't expect server-side rendered HTML.
Your comment is confusing. I can't tell if you agree or disagree with me 😅
What I was saying: HTMX and Hotwire Turbo are for making SPAs (using HTML over the wire). React is for rendering (most likely CSR), and most if not all React frameworks and apps are SPAs too.
I am surprised that nobody has mentioned UnPoly (unpoly.com), a common point of comparison with htmx.
In my understanding, the challenge is to provide the simplicity of server-side rendered pages with a modern UX.
Now you got three choices:
Do browsers have built-in HTMX support, or how does this work without javascript enabled? I can see how easily this approach can prevent needing to write any javascript for a new HTMX app, but how does it work with no javascript at all?
You need JavaScript for HTMX to work, after all it is a JavaScript library. However the architectural approach makes it simpler to do progressive enhancement, as mentioned in the text.
Check out the Clojure repo in the links for an example.
Got it, thank you for the reply! That clears up a lot for me. This seems like something that Browsers should have built-in support for!
It's what HTML was supposed to evolve into. Give it a try. After using it... kinda automatically clicked in my mind "this is what HTML is supposed to be"
There's a typo in your headline. It should be HTMX is A future - not HTMX is THE future.
What you are describing is a different separation of concerns that may, or may not, be applicable in someone's use case.
Here's a thought experiment - imagine if HTMX becomes as popular as React. Why will the coding world be a better place? Won't there be lots of articles describing the best way to structure your hypermedia responses? And won't there be a gazillion libraries for the best way to construct complex HTML on the server side (I mean, you skip over the whole part about what it takes for the server to construct fragments)? And let's not forget the benchmarks and handling large data. And what about the misuse of tags? In your very first example you are arguably misusing dl, because dl is a description list (with dt for the description term and dd for the description details), but you're abusing it as a key-value pairing.
None of the above is a deal breaker - but it just reinforces that the future will be as messy as it is today. There will be some more good choices for us to pick from and some less-good-choices will fade into oblivion.
Yes, a fat client is more expensive than a thin client. But then an SSR backend is more expensive than a vanilla data backend. And what about the middle tier? And microservices? And loose coupling? And a lot of other things that people much smarter than me will be listing. There are way too many holes in this article for it to be a good argument for THE future.
Take the lack of SEO as a "ding" against other approaches as a key argument for SSR. The first question is always "what's my use case?", and SEO often isn't one for a large number of "traditional" applications (I don't want my email or my bank account details to turn up on Google, thanks all the same).
Sure, there are also plenty of apps where SEO is necessary/desirable. But that's just another example of why to NEVER believe an article that says (or implies) that there is one future.
Most of my "beef" is the way the article is written/positioned. HTMX may be an interesting approach for some use cases.
If you have a Design System that follows a Design Language:
This is why React/Svelte/Solid/Angular/Vue are so popular.
Although I don't agree with most of the article, this part made me think
I never considered how much time and effort me and my peers have to do to coordinate responsibilities and JSON contracts
That's why you have standard REST implementations like OData and GraphQL. The contract between frontend and backend should be standardized to make life easier for both parties involved.
I'm reading hypermedia.systems now, and while this is clearly an interesting resource and htmx an interesting technology, I'm baffled by some of the author's assessments of front-end development... Or I would be, if I hadn't encountered them before in the wild from JavaScript developers. Namely, statements like this:
What? Yes, I know there are people who think this way... But they are wrong. The reason Node is good for backend development is because it is a robust technology coupled to a great programming language. Even server side rendering with React frameworks is primarily about optimization and SEO, not about code reuse or overcoming of any hurdles of having code on the UI and the server, which will always happen.
Which brings me to the htmx concept of decoupling, which is also weird: the claim being that by returning html/x, a hypermedia, since the client doesn't need to know about it in advance to render it, it decouples server and client. Which I guess it is true in a way, but it also couples them in the sense that now this can only be changed by altering the server side code. If I get html from the server then I cannot really change it all that much, but if I get an object I can parse it however I want and present it in different ways. The convenience here is mostly that one method empowers the server and the other empowers the client, and I think they both have their use cases, but saying that one of them is more decoupled than the other seems wrong to me.
Still, I'm glad I came across this resource, and will continue to read with a critical eye.
Yep, it's so obvious that HTMX is for backend centric developers that hate frontend.
These kinds of teams would have flocked to projects like: Bootstrap.js, Foundation.js.
Anything that helps them avoid writing .js or .css files.
Funny, I used this approach pretty much exactly in an application that was built before the SPA craze, using jQuery to power it. In the frontend I used regular HTML forms and links with data- attributes to declaratively add submit/click listeners that would make the POST/GET with AJAX, and in the backend I checked for the JSON request header and either returned an HTML partial, or rendered a full page for a GET request or returned a redirect for a POST request. It worked great and was considered progressive enhancement as the application still worked if Javascript was disabled.
Interesting how things are coming full circle!
Isn't HTMX something like Alpine.JS?
They're similar in approach, but complementary: htmx is about navigation and form processing (everything that makes HTTP requests) whereas Alpine.JS is for every other kind of interactivity on your page. There's of course a bit of overlap, but they have different goals.
htmx.org/ is more focused on AJAX, WebSockets, and Server-Sent Events. hyperscript.org/ is more focused on interactivity.
Both came from bigsky.software/