Notes on creating a simple JSON API with Phoenix

Feb 3: This is NOT a finished document. It will get updated until further notice.
Feb 8: I still keep adding stuff, but at a much slower pace.
Feb 18: I'm done adding notes :)
Feb 24: I thought I was done, but not yet. :)

The notes that follow are annotations of things I stumbled upon while creating a simple API in Phoenix.

While I do have some experience with Elixir, I have only superficially played around with Phoenix. My experience with Ecto is basically nil.

The API is quite simple: it accepts a CSV document as the body of a request. Each line of the CSV corresponds to an operation that must, at a later time, be HTTP POSTed to one of N subservices. The whole file must be ‘valid’ for the post-processing to occur.

If the whole file is valid, a record is created for each line in the CSV file.

Subsequently, each of those records can be queried for a status via a simple GET.

There is a little more to this API, but I will leave it at that for the purpose of this post.


Reading the request body

In the relevant controller:

{:ok, body, conn} = Plug.Conn.read_body(conn, length: 1_000_000_000)
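For context, a minimal sketch of how this might sit in a controller action; the action name and the response are illustrative, not from the original notes:

```elixir
def create(conn, _params) do
  # Raise the length limit so a large CSV body is read in one go.
  {:ok, body, conn} = Plug.Conn.read_body(conn, length: 1_000_000_000)
  # `body` now holds the raw CSV as a binary; parse and validate from here.
  send_resp(conn, 200, "received #{byte_size(body)} bytes")
end
```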

How to POST data with new lines from the command line

I stumbled over this one for a bit while testing the API.

When POSTing with curl, if you need to send newlines, you need to send the data as binary. Otherwise it is sent as ASCII and the newlines are stripped.

For a while, I thought this was something on Phoenix’s side.

Calling curl with --data-binary instead of -d solved my problems.
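For example, assuming the endpoint and file name below:

```shell
# -d/--data strips newlines from the file; --data-binary sends it verbatim
curl --data-binary @operations.csv \
  -H "Content-Type: text/csv" \
  http://localhost:5000/api/bulk
```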

Validating a record

I wanted to play with the records in iex. I will eventually need to find out whether a record is valid or not. There is a basic tutorial in the Ecto documentation, in the Ecto module to be precise.

changeset = MyModel.changeset(%MyModel{}, params)
changeset.valid? # => true/false
changeset.errors # => list of errors

Schema migrations table name clash with Rails

Something I didn’t mention, because I thought it wasn’t important, was that this API would share a DB with a Rails app.

To be honest, it doesn’t matter at all since all the tables in use for this API are unique to it.

The problem is that Ecto uses the same table name as Rails for storing migration information.

There is a way to specify the table name for migrations, but it's in Ecto 2.0, which at this moment in time hasn't been released.

To make matters a little bit worse: MariaEx, the MySQL adapter, is not yet compatible with Ecto 2.0, so upgrading to it is not an option.

My temporary solution only works in development mode, but it lets me keep moving forward for the time being: modify Ecto so that it looks for another table.

In the file ecto/lib/ecto/migration/schema_migration.ex, change the two references to “schema_migrations” to whatever name you like.

Error ‘(ArgumentError) field/association :foo_bar is already set on schema’

I had a typo in the association column in one of my migrations. It was plural when it should have been singular.

Also, at that point I hadn’t added the relationship to the model.

# in the migration
create table(:models) do
  add :related_models_id, :integer
end

# in the model
schema "models" do
  field :related_models_id, :integer
end

After fixing the migration and adding the relevant relationship:

# in the migration
create table(:models) do
  add :related_model_id, :integer
end

# in the model
schema "models" do
  field :related_model_id, :integer
  belongs_to :related_model, RelatedModel
end

…I got the aforementioned error. It turns out belongs_to already defines the field for related_model (among other things), so the explicit field declaration must be removed because it's redundant.
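So the final model only declares the association and lets belongs_to define the column; the names follow the example above:

```elixir
# in the model
schema "models" do
  # belongs_to defines :related_model_id for us; no explicit field needed
  belongs_to :related_model, RelatedModel
end
```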

Always return some JSON

The API generates JSON with a specific format. It should, if possible, always return a JSON document with said format, even if an exception was raised.

I wanted to test this behaviour in development.

We need to disable debugging by commenting out debug_errors or setting it to false.

# in config/dev.exs
config :my_app, MyApp.Endpoint,
  http: [port: 5000]
  # debug_errors: true

Also, you need to define a function that returns the actual JSON in your ErrorView:

# in web/views/error_view.ex
def render("500.json", _assigns) do
  # json here
end

Always return some JSON except when you really don’t need to

OK, I lied.

I need to return a JSON document most of the time, but not always. Everything under the json pipeline in the router should return JSON in my preferred format.

One way to limit this JSON to that pipeline is by leveraging the assigns variable that is received by the render function in the view.

assigns contains a conn, and the conn contains path information. So we can pattern match on that:

def render("500.json", %{conn: %{path_info: ["api" | _]}} = _assigns) do
  # json for api goes here
end

def render("500.json", _assigns) do
  # json for everything else here
end

Inserting nested models with Ecto

I was having trouble finding a good pattern for organizing validations/inserts of nested models.

Luckily, StackOverflow had the solution, by none other than Jose Valim:

invoice = Invoice.changeset(params["data"], :create)
items   = Enum.map(params["includes"], &InvoiceItem.changeset(&1, :create))

if invoice.valid? && Enum.all?(items, & &1.valid?) do
  Repo.transaction fn ->
    invoice = Repo.insert(invoice)

    Enum.each(items, fn item ->
      item = Ecto.Changeset.change(item, invoice_id:
      Repo.insert(item)
    end)
  end
else
  # handle errors
end

render expects a dict

Like I said before, I always want this API to return a JSON even on exceptions.

There is a case where an error is detected and I have its textual representation. What I want the JSON to have is something like:

{"error_text": "my error text goes here", "error_code": -1}

So, I only needed to pass the error text to the view:

# in the controller
render conn, "bulk-errors.json", my_error_text

But this raises an exception:

** (ArgumentError) unsupported dict: "my error text"

render/3 expects something dict-like (a map or keyword list) as its third parameter, not a plain string.
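The fix, then, is to wrap the error text in something render/3 accepts, e.g. a keyword list, and pattern match it in the view; the template name follows the example above:

```elixir
# in the controller
render conn, "bulk-errors.json", error_text: my_error_text

# in the view
def render("bulk-errors.json", %{error_text: error_text}) do
  %{error_text: error_text, error_code: -1}
end
```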

Basic Auth for Phoenix

Now, this particular API is for internal use. Even so, it's always better to have some authentication than none.

I was looking for the most simple way to do primitive authentication without a complex setup.

I found that the basic_auth package solved my problems. Its code is small and to the point, which makes it easy to understand what is going on.

If at some point I need to customize it, I can always use it as a starting point.

After adding its dependency to mix.exs, you only need to add its plug to the corresponding pipeline, which in my case is ‘/api’:

pipeline :api do
  plug :accepts, ["json"]
  plug BasicAuth, realm: "Admin Area", username: "username", password: "password"
end

Trying to understand Phoenix.View

If you go to any of the views of your project, especially if it was generated by Phoenix, you will see something like:

defmodule MyApp.ModelView do
  use MyApp.Web, :view

  # this render has an arity of 2
  def render("index.json", %{model: model}) do
    %{data: render_many(model, MyApp.ModelView, "model.json")}
  end

  # ... more code here...
end

What threw me off here was that the documentation for render/3 in Phoenix.View didn’t match the call in this module, which is a call to render with two parameters.

I was expecting that the functions in my MyApp.ModelView matched the signature of Phoenix.View.render/3. But that’s not the case.

It turns out it is the controller that actually calls your view's render/2 function behind the scenes, via a call to Phoenix.View.render_to_iodata/3.
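The dispatch can be seen from iex; module and template names follow the example above:

```elixir
# Phoenix.View dispatches to the render/2 clauses in your view module:
Phoenix.View.render(MyApp.ModelView, "index.json", model: model)
# ...is roughly equivalent to calling:
MyApp.ModelView.render("index.json", %{model: model})
```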

Doing callbacks with ecto

After creating a record, I want to trigger an external action.

My first reaction was to search the Ecto documentation for callbacks. I vaguely remember there being an issue with them. And I wasn’t that wrong: they are being deprecated in Ecto 2.0:
Warning: Ecto callbacks are deprecated.

The proposed solution in Ecto 2.0 will be called Ecto.Multi and will allow you to run a series of operations inside of a transaction, including what are now called “callbacks”.
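As a hedged sketch, this is roughly what the Ecto.Multi proposal looks like; the API may still change before 2.0 ships, and every name here (record_changeset, notify_external_service, the step atoms) is illustrative:

```elixir
alias Ecto.Multi

# Group the insert and the follow-up step in one transaction;
# if any step fails, everything rolls back.
|> Multi.insert(:record, record_changeset)
|>, fn %{record: record} ->
  # what used to be an after_insert callback goes here;
  # the function must return {:ok, _} or {:error, _}
  {:ok, notify_external_service(record)}
|> Repo.transaction()
|> case do
  {:ok, %{record: record}} -> record
  {:error, _step, reason, _changes_so_far} -> {:error, reason}
end
```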

Since my API is quite simple, I decided to not use a callback and simply call the function after the DB operations are finished.

But at least now I know what the future path will be regarding callbacks and ecto.

Background jobs with Exq

If you remember, our API must communicate with other services via POST.

This will be done in the background.

Some of our other apps use Resque or Sidekiq. We like having an admin UI that the people in operations can monitor. Exq offers that plus a bunch of useful features, like throttling and scheduling. Plus, we already have redis running.

Exq is a job processing package that is compatible with Resque and Sidekiq.

While doing background job processing in Elixir without any packages might be trivial, I don't think it's worth it, unless you want to do it for the pleasure of learning. There are many features that you might not implement that will come back and kick you in the butt when you least expect it: job persistence, retries, throttling, failure handling, etc.
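A minimal worker sketch, assuming Exq is already configured and Redis is running; the module name, queue, and arguments are made up:

```elixir
defmodule MyApp.PostWorker do
  # Exq invokes perform/N with the (JSON round-tripped) args
  # the job was enqueued with.
  def perform(record_id, url) do
    # POST the record to the corresponding subservice here
    IO.puts "posting record #{record_id} to #{url}"
  end
end

# enqueue a job on the "default" queue from anywhere in the app:
Exq.enqueue(Exq, "default", MyApp.PostWorker, [42, ""])
```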

Making iex/elixir treat warnings as errors

I was playing with Behaviours and I wanted iex/elixir to complain loudly if I didn’t implement a function for a behaviour.

Turns out that Elixir, at least when running via iex -S mix phoenix.server, warns you about the missing functions, but the compilation still succeeds. If you are not looking for it, you might miss it.

warning: undefined behaviour function query_status/1

You can force iex to fail compiling on warnings by calling it like so:

iex -S mix phoenix.server --erl "--warnings-as-errors"

POSTing data with HTTPoison and parsing with Poison

I like HTTPoison for my HTTP needs. It's concise and it works. What I don't like is that I keep forgetting how to POST data with it. And the docs are not very clear to me on that point.

So, here’s how:

hackney = [basic_auth: {"user", "password"}]
headers = %{"Content-Type" => "application/json", "Accept" => "application/json"}!(@my_url, json_body, headers, [hackney: hackney])
|> Map.fetch!(:body)
|> :unicode.characters_to_binary(:latin1)
|> Poison.decode!

I had problems with one of the APIs failing because of unicode issues. The answer was the :unicode.characters_to_binary/2 call you see there.

Printing the generated response before sending it to the client

This API is being used by a third party that, since they are in development phase, constantly asks me what response I send them.

The problem is, some of the operations change with time.

So I can’t recreate the generated response easily.

I need a way to log the response I send them.

Luckily, there is a way to do that. And yes, it’s done with a plug.

# in router.ex..

# define a function
def my_handler(conn, _opts) do
  Plug.Conn.register_before_send(conn, fn conn ->
    IO.puts "/== start generated response =="
    IO.puts conn.resp_body
    IO.puts "\\== end generated response =="
    conn  # the before_send callback must return the conn
  end)
end

# add that function name to your respective pipeline
pipeline :api do
  plug :accepts, ["json"]
  plug :my_handler
end