Server-Sent Events and Streaming APIs

Use the x-fern-streaming extension to model streaming endpoints

The x-fern-streaming extension allows you to model endpoints that return streamed responses.

JSON streaming

If your API returns a series of JSON chunks, as seen below,

{ "text": "Hi, I am a" }
{ "text": "chatbot. Do you have any" }
{ "text": "questions for me" }

then simply add x-fern-streaming: true to your OpenAPI operation.

openapi.yml
paths:
  /logs:
    post:
      x-fern-streaming: true
      responses:
        "200":
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/Chat"
components:
  schemas:
    Chat:
      type: object
      properties:
        text:
          type: string
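For reference, here is a minimal sketch of what consuming that JSON stream looks like over raw HTTP, assuming a hypothetical base URL and request body (a Fern-generated SDK wraps this handling for you):

consume_json_stream.py
import json

import requests  # third-party HTTP client

# Hypothetical base URL and request body for the /logs endpoint above.
response = requests.post(
    "https://api.example.com/logs",
    json={"prompt": "Hello"},
    stream=True,  # keep the connection open and read the body incrementally
)
response.raise_for_status()

# Each non-empty line of the body is one JSON-encoded Chat chunk.
for line in response.iter_lines(decode_unicode=True):
    if line:
        chunk = json.loads(line)
        print(chunk["text"], end="", flush=True)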

Server-sent events

If your API returns server-sent events, with data (and optionally event) keys as seen below,

data: { "text": "Hi, I am a" }
data: { "text": "chatbot. Do you have any" }
data: { "text": "questions for me" }

then make sure to include format: sse in the x-fern-streaming extension.

openapi.yml
paths:
  /logs:
    post:
      x-fern-streaming:
        format: sse
      responses:
        "200":
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/Chat"
components:
  schemas:
    Chat:
      type: object
      properties:
        text:
          type: string
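For comparison, each server-sent event arrives as a data: line followed by a blank-line separator. The sketch below parses those lines by hand, again assuming a hypothetical base URL and request body; generated SDKs and dedicated SSE clients do this parsing for you:

consume_sse.py
import json

import requests  # third-party HTTP client

# Hypothetical base URL and request body for the /logs endpoint above.
response = requests.post(
    "https://api.example.com/logs",
    json={"prompt": "Hello"},
    stream=True,
)
response.raise_for_status()

# Minimal SSE handling: read the body line by line and decode `data:` payloads.
for line in response.iter_lines(decode_unicode=True):
    if line.startswith("data:"):
        chunk = json.loads(line[len("data:"):].strip())
        print(chunk["text"], end="", flush=True)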

Stream parameter

It has become common practice for endpoints to accept a stream parameter that controls whether the response is streamed or not. Fern supports this pattern in a first-class way.

Simply specify the stream-condition together with both the ordinary response and the streaming response-stream; the condition ($request.stream here) points at the request property that toggles streaming:

openapi.yml
paths:
  /logs:
    post:
      x-fern-streaming:
        format: sse
        stream-condition: $request.stream
        response:
          $ref: '#/components/schemas/Chat'
        response-stream:
          $ref: '#/components/schemas/ChatChunk'
components:
  schemas:
    Chat:
      type: object
      properties:
        text:
          type: string
        tokens:
          type: number
    ChatChunk:
      type: object
      properties:
        text:
          type: string
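To illustrate the pattern end to end, the sketch below calls the same endpoint both ways. It assumes the stream flag lives in the request body, a hypothetical base URL, and an SSE wire format for the streamed variant; none of these details come from the spec above beyond the stream-condition itself:

toggle_stream.py
import json

import requests  # third-party HTTP client

URL = "https://api.example.com/logs"  # hypothetical base URL

# stream: false -> the stream-condition is not met, so a single Chat object comes back.
response = requests.post(URL, json={"prompt": "Hello", "stream": False})
response.raise_for_status()
chat = response.json()
print(chat["text"], chat["tokens"])

# stream: true -> the stream-condition is met, so ChatChunk events are streamed back.
with requests.post(URL, json={"prompt": "Hello", "stream": True}, stream=True) as response:
    response.raise_for_status()
    for line in response.iter_lines(decode_unicode=True):
        if line.startswith("data:"):
            chunk = json.loads(line[len("data:"):].strip())
            print(chunk["text"], end="", flush=True)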