About
Do you ever wonder what is going on inside your Phoenix application? Do you struggle to remember exactly how your data is processed? Llama Logs can put an end to these problems.
Llama Logs is a brand new app that automatically creates real-time, interactive graphs of the activity inside your Phoenix application. It also works seamlessly with JS, Go, and Python, and it is free to start using in your application today!
Below, I will show you how to add Llama Logs to an example Phoenix application and visualize what's going on in the server.
Example App
For this post, I will set up and use a new Phoenix API application. It is a very simple app with a single resource, Projects. A project has three attributes, and the type attribute will be either "Internal" or "External".
schema "projects" do
field :name, :string
field :status, :string
field :type, :string
timestamps()
end
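Since type should only ever hold one of those two values, the changeset can enforce it. Below is a minimal sketch of the schema module, assuming the HelloProjects.Projects.Project module name a Phoenix generator would produce for this app; the actual example repo may differ.

defmodule HelloProjects.Projects.Project do
  use Ecto.Schema
  import Ecto.Changeset

  schema "projects" do
    field :name, :string
    field :status, :string
    field :type, :string

    timestamps()
  end

  # Restrict type to the two values used throughout this post
  def changeset(project, attrs) do
    project
    |> cast(attrs, [:name, :status, :type])
    |> validate_required([:name, :status, :type])
    |> validate_inclusion(:type, ["Internal", "External"])
  end
end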
Adding Llama Logs
Llama Logs is very simple to add to an application. There is a full getting started guide here: https://llamalogs.com/docs/getting-started
Once Llama Logs is initialized, only a few log statements need to be added to generate a real-time visual graph.
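As a rough sketch of what initialization could look like when the application boots, here is how it might sit in application.ex. The LlamaLogs.init/1 call and its account_key and graph_name options are assumptions modeled on the other Llama Logs clients, so check the getting started guide above for the exact function and option names.

defmodule HelloProjects.Application do
  use Application

  def start(_type, _args) do
    # Assumed initialization shape, modeled on the other Llama Logs clients;
    # see the getting started guide for the exact function and option names.
    LlamaLogs.init(%{account_key: "YOUR_ACCOUNT_KEY", graph_name: "phoenix-example"})

    children = [
      HelloProjects.Repo,
      HelloProjectsWeb.Endpoint
    ]

    Supervisor.start_link(children, strategy: :one_for_one, name: HelloProjects.Supervisor)
  end
end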
For this application I thought it would be cool to visualize the incoming traffic based on whether the projects were internal or external. This can be accomplished by adding Llama Log statements into the controller.
Here is an example of the changes to the create function in the project_controller.ex file:
def create(conn, %{"project" => project_params}) do
  with {:ok, %Project{} = project} <- HelloProjects.create_project(project_params) do
    LlamaLogs.log(%{sender: "User", receiver: project.type})
    LlamaLogs.log(%{sender: project.type, receiver: "#{project.type} Create"})
    LlamaLogs.log(%{sender: "#{project.type} Create", receiver: "Database"})

    conn
    |> put_status(:created)
    |> put_resp_header("location", Routes.project_path(conn, :show, project))
    |> render("show.json", project: project)
  end
end
Note that the values used for sender and receiver are evaluated dynamically. This makes it possible to use one statement for multiple data paths.
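For example, one "Internal" create and one "External" create both flow through the same three log statements, yet they appear in the graph as two separate paths. The sketch below sends both requests through the controller using the generated test helpers; the HelloProjectsWeb module names and the project_path route helper are assumptions based on this example app.

defmodule HelloProjectsWeb.ProjectControllerDemoTest do
  use HelloProjectsWeb.ConnCase

  test "internal and external creates take different graph paths", %{conn: conn} do
    # Here the sender/receiver values resolve to "Internal" and "Internal Create"...
    internal =
      post(conn, Routes.project_path(conn, :create),
        project: %{name: "Internal Tooling", status: "active", type: "Internal"}
      )

    assert json_response(internal, 201)

    # ...and here the same statements resolve to "External" and "External Create"
    external =
      post(conn, Routes.project_path(conn, :create),
        project: %{name: "Client Site", status: "active", type: "External"}
      )

    assert json_response(external, 201)
  end
end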
Llama Logs Graph
The Llama Logs client automatically aggregates the data from those statements and sends it to the Llama Logs servers. Once there, a graph of the information is created in real time to show the ongoing activity in the server.
The graph below shows the activity within the example project with users creating, viewing, and updating their project models.
Each component in the graph is auto generated based on the strings in your Llama Log statements.
Each shape moving between components represents multiple events. This way, up to millions of events can be visualized simultaneously. The key of what each shape represents is shown below.
Errors
The key shows error logs as well. This is a neat feature of Llama Logs that lets you dynamically catch and visualize errors in your Phoenix application.
With these lines added to the example app, errors during the project creation call can be caught. Note the is_error field in the third statement.
def create(conn, %{"project" => project_params}) do
  with {:ok, %Project{} = project} <- HelloProjects.create_project(project_params) do
    ...
  else
    _result ->
      LlamaLogs.log(%{sender: "User", receiver: project_params["type"]})
      LlamaLogs.log(%{sender: project_params["type"], receiver: "#{project_params["type"]} Create"})
      LlamaLogs.log(%{sender: "#{project_params["type"]} Create", receiver: "Database", is_error: true})

      conn
      |> put_status(:bad_request)
      |> render("error.json", %{})
  end
end
I set the name field on a project to be unique, so errors can be triggered by creating a project with a pre-existing name.
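One standard way to set that up, and the way assumed in this sketch, is a unique index in a migration; the exact migration in the example repo may look different.

defmodule HelloProjects.Repo.Migrations.AddUniqueIndexToProjectsName do
  use Ecto.Migration

  def change do
    # Enforce uniqueness of project names at the database level
    create unique_index(:projects, [:name])
  end
end

Pairing it with unique_constraint(:name) in the project changeset turns the index violation into an {:error, changeset} result, which falls through to the else branch above and fires the is_error log.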
Then, below, we can see in real time the errors that were triggered while creating external projects.
Resources
All the code used for this example Phoenix app can be found in this repo:
https://github.com/llamalogs/PhonixBlogExample
More information on Llama Logs and the Elixir/Phoenix developer guide can be found at:
https://llamalogs.com/docs
Please reach out if you have any questions or need help getting Llama Logs set up in your Phoenix application. I can be reached at andrew@llamalogs.com