
I'm using FastAPI to serve ML models. My endpoint receives and sends JSON data of the form:

[
  {"id": 1, "data": [{"code": "foo", "value": 0.1}, {"code": "bar", "value": 0.2}, ...]},
  {"id": 2, "data": [{"code": "baz", "value": 0.3}, {"code": "foo", "value": 0.4}, ...]},
  ...
]

My models and app look as follows:

from typing import Dict, List
 
from fastapi import Body, FastAPI
from fastapi.responses import JSONResponse
from pydantic import BaseModel
import pandas as pd


class Item(BaseModel):
    code: str
    value: float


class Sample(BaseModel):
    id: int
    data: List[Item]


app = FastAPI()


@app.post("/score", response_model=List[Sample])  # correct response documentation
def score(input_data: List[Sample] = Body(...)):  # 1. conversion dict -> Pydantic models, slow
    input_df: pd.DataFrame = models_to_df(input_data)  # 2. conversion Pydantic models -> df

    output_df: pd.DataFrame = predict(input_df)

    output_data: Dict = df_to_dict(output_df)  # direct conversion df -> dict, fast
    return JSONResponse(output_data)
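For context, the helper functions are not shown; a minimal sketch of what `models_to_df` might look like, assuming a flat one-row-per-item layout (the column layout is my assumption, not from the question):

```python
import pandas as pd


def models_to_df(samples) -> pd.DataFrame:
    # `samples` is the validated List[Sample]; flatten each nested
    # Item into one row per (id, code, value).
    records = [
        {"id": s.id, "code": item.code, "value": item.value}
        for s in samples
        for item in s.data
    ]
    return pd.DataFrame(records, columns=["id", "code", "value"])
```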

Everything works fine and the automated documentation looks good, but the performance is bad. Since the data can be quite large, Pydantic conversion and validation can take a lot of time.

This can easily be solved by writing direct conversion functions between the JSON data and data frames, skipping the intermediate representation as Pydantic models. This is what I did for the response, achieving a 10x speedup while preserving the automated API documentation via the response_model=List[Sample] argument.
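A sketch of such a direct df-to-dict conversion on the response side (assuming the frame has id/code/value columns; the exact layout is an assumption):

```python
import pandas as pd


def df_to_dict(df: pd.DataFrame) -> list:
    # Rebuild the nested [{"id": ..., "data": [...]}, ...] shape directly,
    # without round-tripping through Pydantic models.
    return [
        {"id": sample_id, "data": group[["code", "value"]].to_dict(orient="records")}
        for sample_id, group in df.groupby("id", sort=False)
    ]
```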

I would like to achieve the same with the request: being able to use custom JSON input parsing, while at the same time preserving API documentation using Pydantic models. Sadly I can't find a way to do it in the FastAPI docs. How can I accomplish this?

1 Answer


You can always accept the raw request, load the body as bytes via request.body(), and do your own decoding. The schema of the request body can then be documented as a (partial) raw OpenAPI Operation structure using the openapi_extra argument to the @app.post() decorator:

from fastapi import Request

@app.post(
    "/score",
    response_model=List[Sample],
    openapi_extra={
        "requestBody": {
            "content": {
                "application/json": {
                    "schema": {
                        "type": "array",
                        "items": Sample.schema(ref_template="#/components/schemas/{model}"),
                    }
                }
            }
        }
    },
)
async def score(request: Request):
    raw_body = await request.body()
    # parse the `raw_body` request data (bytes) into your DF directly.

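For instance, the nested payload can go straight from the raw bytes into a flat frame; a sketch using the standard-library json module (a faster decoder such as orjson is a common drop-in variation):

```python
import json

import pandas as pd


def body_to_df(raw_body: bytes) -> pd.DataFrame:
    # Decode the JSON once, then let pandas flatten the nested "data"
    # lists, carrying the parent "id" along with each row.
    payload = json.loads(raw_body)
    return pd.json_normalize(payload, record_path="data", meta=["id"])
```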
The openapi_extra structure is merged into the operation structure generated from the other components (such as the response_model). I used your existing Sample model here to provide the schema for the array items, but you can also spell out the whole schema manually. (On Pydantic v2, Sample.schema(...) is deprecated in favour of Sample.model_json_schema(...), which accepts the same ref_template argument.)

Instead of using the raw bytes of the body, you could also delegate parsing as JSON to the request object:

data = await request.json()

If there is a way to parse the data as a stream (pushing chunks to a parser), you could avoid the memory overhead of loading the whole body at once by treating the request as a stream in an async loop:

parser = ...  # something that can be fed chunks of data
async for chunk in request.stream():
    parser.feed(chunk)

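As a sketch of what such a chunk-fed parser could look like using only the standard library (the class name is illustrative; a library like ijson offers a more robust push interface), json.JSONDecoder.raw_decode can pull complete top-level array elements out of a growing buffer:

```python
import codecs
import json


class ArrayChunkParser:
    """Pulls complete top-level elements of a JSON array out of a
    byte stream fed in arbitrary chunks (illustrative sketch)."""

    def __init__(self):
        self._decoder = json.JSONDecoder()
        # Incremental decoder copes with multi-byte characters split
        # across chunk boundaries.
        self._utf8 = codecs.getincrementaldecoder("utf-8")()
        self._buf = ""
        self._in_array = False
        self.items = []

    def feed(self, chunk: bytes) -> None:
        self._buf += self._utf8.decode(chunk)
        if not self._in_array:
            start = self._buf.find("[")
            if start == -1:
                return  # opening bracket not seen yet
            self._buf = self._buf[start + 1:]
            self._in_array = True
        while True:
            # Skip whitespace and the comma separating elements.
            s = self._buf.lstrip().lstrip(",").lstrip()
            if not s or s.startswith("]"):
                self._buf = s
                return
            try:
                obj, end = self._decoder.raw_decode(s)
            except json.JSONDecodeError:
                self._buf = s  # element incomplete; wait for the next chunk
                return
            self.items.append(obj)
            self._buf = s[end:]
```

Feeding the request chunks into parser.feed(chunk) inside the async for loop above then makes the decoded objects available in parser.items as each element completes.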
This is documented in the Custom OpenAPI path operation schema section in the Advanced User Guide. The same section also covers Using the Request object directly, and the various options for handling the request body can be found in the Starlette Request class documentation.


2 Comments

Thanks a lot for this detailed answer; that's exactly what I needed! I just had to add a ref_template="#/components/schemas/{model}" argument to the .schema() method in your code sample to get correct OpenAPI references. I suggested an edit to your answer.
@Macfli: darn, I had made a mental note to verify the ref_template argument but other priorities interfered. Edit applied!
