
Autocomplete input + caching results with Phoenix LiveView

Santiago Cardona

In this post, I want to share with you the machinery I've used to implement a geolocation search input with autocomplete, using the Here Maps API. Alongside it, I've used the browser's Geolocation API to improve precision, and I cache results for better performance and cost savings when hitting third-party geolocation backends.

[GIF: the autocomplete input suggesting locations as the user types]

First things first: I have to say that for this post I took a huge amount of inspiration and code from the fantastic book Programming Phoenix 1.4 by Chris McCord, Bruce Tate, and José Valim, especially the cache and API request parts. If you want a deeper explanation, don't hesitate to read it from the source.

Setting Up the Search Form View

<form phx-change="suggest" phx-submit="search">

      <input
        type="text"
        name="origin"
        value="<%= @origin %>"
        placeholder="Origen"
        list="origins"
        autocomplete="off"
        phx-debounce="700"
      />
      <datalist id="origins">
        <%= for place <- @origins do %>
          <option value="<%= place.title %>"><%= place.title %></option>
        <% end %>
      </datalist>

      <input
        type="text"
        name="destination"
        value="<%= @destination %>"
        placeholder="Destino"
        list="destinations"
        autocomplete="off"
        phx-debounce="700"
      />
      <datalist id="destinations">
        <%= for place <- @destinations do %>
          <option value="<%= place.title %>"><%= place.title %></option>
        <% end %>
      </datalist>

      <%= live_component @socket, CarpoolingWeb.PositionComponent, id: "position" %>

      <button type="submit" phx-disable-with="Buscando...">Buscar</button>

</form>

This form uses a LiveView Component to get the browser's position and include that position (latitude and longitude) in the params sent from the form:

defmodule CarpoolingWeb.PositionComponent do
  use CarpoolingWeb, :live_component

  def mount(socket) do
    {:ok, socket}
  end

  def render(assigns) do
    ~L"""
    <input
      id="current-position"
      name="position"
      type="hidden"
      phx-update="ignore"
      phx-hook="SetCurrentPosition"
    >
    """
  end
end

In this PositionComponent we use a phx-hook for JavaScript interoperability: it accesses the browser's Geolocation API to read the user's current position. We also set phx-update="ignore" so LiveView leaves this input alone on re-renders and doesn't wipe the value the hook sets every time the form changes.

We set up the hook in app.js (in this case, the one called SetCurrentPosition) and then register it on the liveSocket:

const setPosition = ({ coords }) => {
  const x = document.getElementById("current-position")
  x.value = coords.latitude + "," + coords.longitude
}

const Hooks = {
  SetCurrentPosition: {
    mounted() {
      if (navigator.geolocation) {
        navigator.geolocation.getCurrentPosition(setPosition)
      }
    }
  }
}

const csrfToken = document.querySelector("meta[name='csrf-token']").getAttribute("content")
const liveSocket = new LiveSocket("/live", Socket, {
  hooks: Hooks,
  params: { _csrf_token: csrfToken }
})

Setting Up the LiveView

defmodule CarpoolingWeb.RideLive.Index do
  use CarpoolingWeb, :live_view

  alias Carpooling.Locations

  @impl true
  def mount(_params, _session, socket) do
    {:ok,
     assign(socket,
       origin: "",
       origins: [],
       destination: "",
       destinations: []
     )}
  end

  @impl true
  def handle_params(params, _url, socket) do
    {:noreply, apply_action(socket, socket.assigns.live_action, params)}
  end

  @impl true
  def handle_event("suggest", params, socket) do
    %{
      "origin" => origin,
      "destination" => destination,
      "position" => position
    } = params

    origins =
      Locations.get_locations(origin, position)

    destinations =
      Locations.get_locations(destination, position)

    {:noreply,
     assign(socket,
       origins: origins,
       origin: origin,
       destinations: destinations,
       destination: destination
     )}
  end

  defp apply_action(socket, :index, _params) do
    socket
    |> assign(:page_title, "Transporte Solidario")
  end
end

In the mount callback we initialize the origins and destinations assigns as empty lists so the form can render; they get filled every time the user types in the location text inputs.

We have a handle_event callback that listens for the "suggest" event, triggered every time the form changes (via the form's phx-change binding). Here we use the Locations.get_locations/2 function to obtain the origins and destinations lists, and then update those lists in the socket's assigns.

At this point, we can see how smoothly Phoenix LiveView handles the client logic 😍. With just a couple of LiveView bindings, we have autocomplete inputs ready.

Implementing the Locations Module (Locations Context: Geolocation API Requests + Caching Results)

Here is where the magic happens. The Locations context is responsible for making the HTTP requests that fetch the list of possible locations matching the user's input, and also for caching those results.

defmodule Carpooling.Locations do
  @backends [Carpooling.Locations.HereMaps]

  defmodule Result do
    defstruct backend: nil, locations: nil
  end

  alias Carpooling.Locations.Cache

  def get_locations(query, point), do: search(query, point)

  defp search(query, point) do
    if String.length(query) >= 5 do
      compute(query, point)
      |> Enum.map(fn item -> item.locations end)
      |> List.flatten()
    else
      []
    end
  end

  defp compute(query, point, opts \\ []) do
    timeout = opts[:timeout] || 10_000
    opts = Keyword.put_new(opts, :limit, 10)
    backends = opts[:backends] || @backends

    {uncached_backends, cached_results} = fetch_cached_results(backends, query, opts)

    uncached_backends
    |> Enum.map(&async_query(&1, query, point, opts))
    |> Task.yield_many(timeout)
    |> Enum.map(fn {task, res} ->
      res || Task.shutdown(task, :brutal_kill)
    end)
    |> Enum.flat_map(fn
      {:ok, results} -> results
      _ -> []
    end)
    |> write_results_to_cache(query, opts)
    |> Kernel.++(cached_results)
    |> Enum.take(opts[:limit])
  end

  defp fetch_cached_results(backends, query, opts) do
    {uncached_backends, results} =
      Enum.reduce(
        backends,
        {[], []},
        fn backend, {uncached_backends, acc_results} ->
          case Cache.fetch({backend.name(), query, opts[:limit]}) do
            {:ok, results} ->
              {uncached_backends, [results | acc_results]}

            :error ->
              {[backend | uncached_backends], acc_results}
          end
        end
      )

    {uncached_backends, List.flatten(results)}
  end

  defp async_query(backend, query, point, opts) do
    Task.Supervisor.async_nolink(
      Carpooling.TaskSupervisor,
      backend,
      :compute,
      [query, point, opts],
      shutdown: :brutal_kill
    )
  end

  defp write_results_to_cache(results, query, opts) do
    Enum.map(results, fn %Result{backend: backend} = result ->
      :ok = Cache.put({backend.name(), query, opts[:limit]}, result)

      result
    end)
  end
end

In this case, I'm using the Here Maps API to get location suggestions for the autocomplete. But we can integrate more APIs, such as Google Maps, simply by implementing a module for each API (plus its result parsing) and adding it to the @backends list. We also define a Result struct to identify each backend's results, in case we need that distinction in the future.
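For instance, adding a second backend is just a matter of writing another module that follows the same contract. Here is a minimal, hypothetical sketch — the endpoint, query params, and JSON field names below are illustrative placeholders, not Google's actual API:

```elixir
defmodule Carpooling.Locations.GoogleMaps do
  alias Carpooling.Locations.Result

  @behaviour Carpooling.Locations.Backend

  @impl true
  def name, do: "google_maps"

  @impl true
  def compute(query, _point, _opts) do
    # Hypothetical request + parsing; adapt to the real API's URL and JSON shape
    case HTTPoison.get(url(query)) do
      {:ok, %{status_code: 200, body: body}} ->
        locations =
          body
          |> Poison.Parser.parse!()
          |> Map.get("results", [])
          |> Enum.map(fn item ->
            %{
              address: item["formatted_address"],
              position: item["geometry"],
              title: item["name"]
            }
          end)

        [%Result{backend: __MODULE__, locations: locations}]

      _ ->
        []
    end
  end

  # Placeholder endpoint; swap in the real geocoding URL
  defp url(query) do
    "https://example.invalid/geocode?" <> URI.encode_query(q: query, key: api_key())
  end

  defp api_key, do: Application.fetch_env!(:carpooling, :google_maps)[:apikey]
end
```

With a module like this in place, the only change needed in the Locations context is extending the list: `@backends [Carpooling.Locations.HereMaps, Carpooling.Locations.GoogleMaps]`.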

We have four important functions here: compute, fetch_cached_results, async_query and write_results_to_cache. Their names are fairly descriptive of what each one does.

The compute function is the one that invokes the others, and also where we set up configuration such as the results limit, request timeout, and so on. It processes cached and uncached results, concatenates them, and returns the combined list.

The fetch_cached_results function loops over each geolocation backend and asks the Cache module for cached results (cached_results) for the given query and backend. If a backend has no cached results for that specific query, it gets pushed into the uncached_backends list.

The async_query function is in charge of getting results from the geolocation backends by calling their compute callback (each backend module must implement it; we'll cover that shortly). We use an Elixir Task to perform the request, and more importantly the async_nolink function from Task.Supervisor: we want the request to be asynchronous, but we don't want an HTTP error in one request to crash the calling process, so we simply kill the specific task that errored. Since typing in the search input constantly fires Locations.get_locations/2, and we may have several backend integrations, we don't mind if a single HTTP request fails; killing that task is far better than restarting the whole process.
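To see this crash isolation outside the app, here is a tiny self-contained sketch (the supervisor name :demo_sup is made up) using the same yield_many + shutdown pattern as compute above:

```elixir
# One successful task and one crashing task, supervised without linking to the caller
{:ok, _sup} = Task.Supervisor.start_link(name: :demo_sup)

tasks = [
  Task.Supervisor.async_nolink(:demo_sup, fn -> [:a_result] end),
  Task.Supervisor.async_nolink(:demo_sup, fn -> raise "backend down" end)
]

results =
  tasks
  |> Task.yield_many(1_000)
  |> Enum.map(fn {task, res} -> res || Task.shutdown(task, :brutal_kill) end)
  |> Enum.flat_map(fn
    # Only successful tasks contribute results; crashes and timeouts are dropped
    {:ok, results} -> results
    _ -> []
  end)

# results == [:a_result], and the calling process is still alive
```

The crashed task shows up as an `{:exit, reason}` tuple from Task.yield_many/2 and is discarded, while the caller keeps running because async_nolink monitors the task instead of linking to it.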

Lastly, the write_results_to_cache function is in charge of putting uncached results in the cache. Here we loop over each backend's results, store them in the cache, and then return those same results. It's a function that returns exactly what it receives, but performs a side effect in the middle, making sure that each backend's result ends up cached.

Implementing the Backend Module (Backend's Behaviour)

As I mentioned above, we can integrate as many geolocation backends as we wish. To use them consistently, we define a behaviour that each of them must follow. This behaviour has two callbacks: name and compute. The name/0 callback returns the backend's name, and compute/3 returns a list of results shaped like the Result struct.

defmodule Carpooling.Locations.Backend do
  @callback name() :: String.t()
  @callback compute(query :: String.t(), point :: String.t(), opts :: Keyword.t()) :: [%Carpooling.Locations.Result{}]
end

Implementing HereMaps Module

Since we are using Here Maps to get geolocations, we implement the module where the HTTP requests are made, making sure it follows the Backend behaviour by returning a list of results with the shape of the Result struct.

defmodule Carpooling.Locations.HereMaps do
  alias Carpooling.Locations.Result

  @behaviour Carpooling.Locations.Backend

  @base_discover "https://discover.search.hereapi.com/v1/discover"
  @base_geocode "https://geocode.search.hereapi.com/v1/geocode"

  @impl true
  def name, do: "here_maps"

  @impl true
  def compute(query_str, point, _opts) do
    fetch(query_str, point)
    |> build_results()
  end

  defp fetch(query, point) do
    url(query, point)
    |> HTTPoison.get()
    |> handle_response()
  end

  defp url(query, point) do
    query = URI.encode_query(q: query, apiKey: api_key()) <> "&in=countryCode:COL"

    if String.length(point) > 0 do
      "#{@base_discover}?#{query}&at=#{point}"
    else
      "#{@base_geocode}?#{query}"
    end
  end

  defp api_key, do: Application.fetch_env!(:carpooling, :here_maps)[:apikey]

  defp handle_response({:ok, %{status_code: status_code, body: body}}) do
    {
      status_code |> check_for_error(),
      body |> Poison.Parser.parse!()
    }
  end

  # Request-level errors (timeouts, DNS failures, etc.) fall through to
  # build_results/1, which returns [] for anything that isn't a success tuple
  defp handle_response({:error, _reason} = error), do: error

  defp check_for_error(200), do: :ok
  defp check_for_error(_), do: :error

  defp build_results({:ok, %{"items" => items}}) do
    locations =
      items
      |> Enum.map(fn item ->
        %{
          address: item["address"],
          position: item["position"],
          title: item["title"]
        }
      end)

    [%Result{backend: __MODULE__, locations: locations}]
  end

  defp build_results(_response), do: []
end
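The api_key/0 helper above reads the key from application config via Application.fetch_env!/2. A minimal sketch of that configuration — the environment variable name here is an assumption:

```elixir
# config/runtime.exs (or config/config.exs, depending on your setup)
import Config

config :carpooling, :here_maps,
  apikey: System.get_env("HERE_MAPS_API_KEY")
```

Keeping the key out of source control and loading it from the environment at runtime means the same release works across environments.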

Implementing the Cache Module

We're almost done, but that doesn't make this part any less important. The Cache module plays a pretty important role in our solution, and it shows off some of Elixir's beauty 😍: GenServer and ETS, two fantastic built-in tools.

We create a GenServer to handle put and fetch calls against the ETS table, which is in charge of persisting geolocation results. This GenServer also schedules a background job that clears the table, every 600 seconds in this case.

defmodule Carpooling.Locations.Cache do
  use GenServer

  @clear_interval :timer.seconds(600)

  def put(name \\ __MODULE__, key, value) do
    true = :ets.insert(tab_name(name), {key, value})

    :ok
  end

  def fetch(name \\ __MODULE__, key) do
    {:ok, :ets.lookup_element(tab_name(name), key, 2)}
  rescue
    ArgumentError -> :error
  end

  def start_link(opts) do
    opts = Keyword.put_new(opts, :name, __MODULE__)
    GenServer.start_link(__MODULE__, opts, name: opts[:name])
  end

  def init(opts) do
    state = %{
      interval: opts[:clear_interval] || @clear_interval,
      timer: nil,
      table: new_table(opts[:name])
    }

    {:ok, schedule_clear(state)}
  end

  def handle_info(:clear, state) do
    :ets.delete_all_objects(state.table)
    {:noreply, schedule_clear(state)}
  end

  defp schedule_clear(state) do
    %{state | timer: Process.send_after(self(), :clear, state.interval)}
  end

  defp new_table(name) do
    name
    |> tab_name()
    |> :ets.new([
      :set,
      :named_table,
      :public,
      read_concurrency: true,
      write_concurrency: true
    ])
  end

  defp tab_name(name), do: :"#{name}_cache"
end
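Assuming the Cache module above is compiled, a quick IEx-style round trip shows the public API. The key tuple here mirrors the `{backend_name, query, limit}` shape the Locations context uses:

```elixir
{:ok, _pid} = Carpooling.Locations.Cache.start_link([])

key = {"here_maps", "bogota", 10}

# Nothing cached yet for this key
:error = Carpooling.Locations.Cache.fetch(key)

:ok = Carpooling.Locations.Cache.put(key, [:some_result])
{:ok, [:some_result]} = Carpooling.Locations.Cache.fetch(key)
```

Until the :clear message fires, repeated lookups for the same backend/query/limit hit ETS instead of the third-party API.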

Once everything is in place, we only need to add the Cache module and the Task Supervisor (for the async_nolink/5 call) to the app's supervision tree:

...

children = [
  Carpooling.Locations.Cache,
  {Task.Supervisor, name: Carpooling.TaskSupervisor},
  ...
]

...

That's it! Enjoy it!

I hope this helps you build a robust autocomplete search mechanism while enjoying development with Elixir 💜 and Phoenix LiveView 🧡.

To see the full implementation you can visit the repo:

GitHub: santiagocardo80/carpooling — a carpooling app to connect workers and students for a collaborative economy.

