Function Calling

also: Tool Use · OpenAI Functions · Tool Calling


Function calling is just JSON serialization and function dispatch with extra steps

Under the hood

  • JSON serialization: converting data structures to and from the JSON text format for storage or transmission. Wikipedia ↗
  • function dispatch: calling a function by name at runtime — the same mechanism as any callback system. Wikipedia ↗
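
Both mechanisms fit in a few lines of TypeScript (the `greet` function is hypothetical, purely for illustration):

```typescript
// Serialization: data structure -> JSON text -> data structure again
const payload = JSON.stringify({ city: "Chicago" }); // '{"city":"Chicago"}'
const parsed = JSON.parse(payload) as { city: string };

// Dispatch: look a function up by name at runtime and call it
const fns: Record<string, (arg: string) => string> = {
  greet: (name) => `hello, ${name}`, // hypothetical function for illustration
};
const greeting = fns["greet"](parsed.city); // "hello, Chicago"
```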

What they say

Function calling lets AI models “interact with external systems,” “execute real-world actions,” and “access up-to-date information.” It’s presented as a breakthrough capability where the model can “decide” to call your functions.

What it actually is

You send the LLM a list of function signatures as JSON Schema.1 The LLM responds with a JSON object containing the function name and arguments. You deserialize the JSON, look up the function in a dispatch table,2 call it, and send the result back.

The pattern in pseudocode

// 1. Define tools as JSON Schema
const tools = [{
  name: "get_weather",
  parameters: {
    type: "object",
    properties: { city: { type: "string" } }
  }
}];

// 2. LLM returns structured JSON
const response = await llm.chat({ messages, tools });
// response.tool_calls = [{ name: "get_weather", arguments: '{"city":"Chicago"}' }]

// 3. Dispatch table — just a lookup
const dispatch: Record<string, (args: any) => Promise<unknown>> = {
  get_weather: (args) => weatherApi.get(args.city),
};

// 4. Deserialize and call
const call = response.tool_calls[0];
const result = await dispatch[call.name](JSON.parse(call.arguments));
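
The final step described above, sending the result back, is just another message appended to the conversation. A minimal sketch, assuming OpenAI-style `role`/`tool_call_id` field names (exact names vary by provider):

```typescript
// Assumed shapes, modeled on OpenAI's chat format
type ToolCall = { id: string; name: string; arguments: string };
type Message =
  | { role: "user" | "assistant"; content: string }
  | { role: "tool"; tool_call_id: string; content: string };

// 5. Serialize the result and append it as a tool message;
//    the conversation then goes back to the LLM for a final answer
function toolResultMessage(call: ToolCall, result: unknown): Message {
  return {
    role: "tool",
    tool_call_id: call.id,
    content: JSON.stringify(result), // back to JSON text for the wire
  };
}
```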

The “extra steps”

  1. Schema definition — describing your functions as JSON Schema (interface definition)
  2. Structured output — the LLM formats its “decision” as valid JSON (constrained decoding)
  3. Parallel tool calls — the LLM can request multiple calls at once (batch dispatch)
  4. Forced tool use — making the LLM always call a specific function (removing the conditional)
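
Step 3, for instance, is just `Promise.all` over the same dispatch table. A sketch with stubbed entries (the function names and bodies are illustrative, not a real API):

```typescript
type ToolCall = { name: string; arguments: string };

// Stub dispatch table; real entries would call out to actual services
const dispatch: Record<string, (args: any) => Promise<unknown>> = {
  get_weather: async (args) => `weather in ${args.city}`,
  get_time: async (args) => `time in ${args.tz}`,
};

// Batch dispatch: run every requested tool call concurrently
async function runToolCalls(calls: ToolCall[]): Promise<unknown[]> {
  return Promise.all(
    calls.map((c) => dispatch[c.name](JSON.parse(c.arguments)))
  );
}
```

`Promise.all` preserves request order, so pairing each result with its originating tool call is straightforward.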

What you already know

If you’ve parsed a webhook payload and called a different function based on event.type, you understand function calling. The only difference is that the thing sending the payload is an LLM that decided which function to invoke by reading natural language.

// webhook handler you've written before
if (event.type === 'payment.succeeded') await handlePayment(event.data);
if (event.type === 'user.created')      await sendWelcomeEmail(event.data);

// function calling — same shape, different sender
if (call.name === 'get_weather')  result = await getWeather(JSON.parse(call.arguments));
if (call.name === 'send_email')   result = await sendEmail(JSON.parse(call.arguments));

The LLM is just a new kind of caller.3

Footnotes

  1. JSON Schema — Wikipedia — what you’re actually writing when you define tool parameters. Understanding type, properties, and required covers 90% of real tool definitions. Both OpenAI and Anthropic use this as their tool description format.

  2. Function (computer programming) — Wikipedia — the dispatch table is literally just an object where keys are function names and values are function references. “Dynamic dispatch” is the CS term for what the LLM triggers.

  3. OpenAI introduced function calling as a named feature in June 2023. The tools array in the request is the interface definition; tool_calls in the response is the dispatch signal. Anthropic’s tool use uses the same concept with different field names.