In the post on AI agents with Ollama, I hacked together a home-grown tool-calling system: a TOOL: name format parsed by hand, a tool registry, a ReAct loop. It worked, but it was throwaway code. Every new AI client was going to need a different integration.
MCP (Model Context Protocol) is the standardised answer to that problem. A common contract between AI clients (Claude Desktop, VS Code Copilot, Cursor…) and tool servers. Write a server once, plug it in everywhere.
In this post: building a minimal MCP server in C#, with a concrete weather example (public API, no key required), and plugging it into Claude Desktop.
What MCP Solves
Without MCP, every tool integration is ad hoc:
- OpenAI has its function calling format
- Ollama has its own (OpenAI-compatible)
- Every DIY agent has its own
MCP defines a standard protocol (JSON-RPC 2.0): the server exposes tools, the client discovers and calls them, the LLM decides when to use them.
Three server-side primitives: Tools (callable functions), Resources (data injected into context), Prompts (reusable templates). This post covers Tools, the most direct use case.
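To make that concrete, here is roughly what the wire exchange looks like. This is a simplified sketch: the real session starts with an `initialize` handshake, the responses carry more metadata, and the exact tool name depends on how the SDK derives it from the method name (I'm showing a snake_case guess here):

```jsonc
// Client → server: discover available tools
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

// Server → client (abridged): name, description, and a JSON Schema for the inputs
{"jsonrpc": "2.0", "id": 1, "result": {"tools": [
  {"name": "get_current_weather",
   "description": "Gets current weather conditions for a city...",
   "inputSchema": {"type": "object", "properties": {"city": {"type": "string"}}, "required": ["city"]}}
]}}

// Client → server: invoke the tool with the arguments the LLM chose
{"jsonrpc": "2.0", "id": 2, "method": "tools/call",
 "params": {"name": "get_current_weather", "arguments": {"city": "Montreal"}}}
```

The LLM never speaks JSON-RPC itself; the client translates between the model's tool-call output and these messages.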
The C# SDK (ModelContextProtocol)
There’s an official SDK, co-maintained by Anthropic and Microsoft:
dotnet add package ModelContextProtocol --prerelease
dotnet add package Microsoft.Extensions.Hosting
dotnet add package Microsoft.Extensions.Http
Important note: it’s in preview (0.9.0-preview.1 at the time of writing). The API may change before 1.0. I pin the version in the .csproj to avoid surprises.
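Pinning looks like this in the .csproj (the version numbers here are illustrative; pin whatever you actually tested against):

```xml
<ItemGroup>
  <!-- Pinned exactly: preview packages can introduce breaking changes -->
  <PackageReference Include="ModelContextProtocol" Version="0.9.0-preview.1" />
  <PackageReference Include="Microsoft.Extensions.Hosting" Version="9.0.0" />
  <PackageReference Include="Microsoft.Extensions.Http" Version="9.0.0" />
</ItemGroup>
```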
The Example: Weather via Open-Meteo
I chose Open-Meteo: a public, free weather API, no key required. That lets me stay focused on MCP rather than authentication plumbing.
The full code is available here: mongeon/code-examples · dotnet/mcp/mcp-weather-server
Project Structure
mcp-weather-server/
  mcp-weather-server.csproj
  Program.cs
  WeatherTools.cs
  WeatherService.cs
  GeocodingResponse.cs
  OpenMeteoResponse.cs
  WeatherResult.cs
Program.cs
var builder = Host.CreateApplicationBuilder(args);

// Logs go to stderr, stdout is reserved for the MCP transport (JSON-RPC)
builder.Logging.AddConsole(opts =>
{
    opts.LogToStandardErrorThreshold = LogLevel.Trace;
});

builder.Services.AddHttpClient<WeatherService>();

builder.Services
    .AddMcpServer()
    .WithStdioServerTransport()
    .WithToolsFromAssembly();

await builder.Build().RunAsync();
WithStdioServerTransport() is the core piece: the AI client launches the server as a child process and communicates over stdin/stdout using JSON-RPC 2.0. No network port to configure.
WithToolsFromAssembly() auto-discovers methods marked [McpServerTool] in the assembly.
WeatherService.cs
public class WeatherService(HttpClient httpClient)
{
    public async Task<WeatherResult?> GetCurrentWeatherAsync(string city)
    {
        // Step 1: geocoding
        var geoUrl = $"https://geocoding-api.open-meteo.com/v1/search" +
                     $"?name={Uri.EscapeDataString(city)}&count=1&format=json";

        var geoResponse = await httpClient.GetFromJsonAsync<GeocodingResponse>(geoUrl);
        var location = geoResponse?.Results?.FirstOrDefault();
        if (location is null) return null;

        // Step 2: current weather
        var weatherUrl = $"https://api.open-meteo.com/v1/forecast" +
                         $"?latitude={location.Latitude}&longitude={location.Longitude}" +
                         "&current=temperature_2m,apparent_temperature," +
                         "weathercode,windspeed_10m,relative_humidity_2m&timezone=auto";

        var weather = await httpClient.GetFromJsonAsync<OpenMeteoResponse>(weatherUrl);
        if (weather?.Current is null) return null;

        return new WeatherResult(
            City: location.Name,
            Country: location.Country,
            Temperature: weather.Current.Temperature,
            ApparentTemperature: weather.Current.ApparentTemperature,
            Humidity: weather.Current.RelativeHumidity,
            WindSpeed: weather.Current.WindSpeed,
            Condition: WmoCodeToDescription(weather.Current.WeatherCode)
        );
    }

    private static string WmoCodeToDescription(int code) => code switch
    {
        0 => "Clear sky",
        1 => "Mainly clear",
        2 => "Partly cloudy",
        3 => "Overcast",
        45 or 48 => "Fog",
        51 or 53 or 55 => "Drizzle",
        61 or 63 or 65 => "Rain",
        71 or 73 or 75 => "Snow",
        80 or 81 or 82 => "Rain showers",
        95 => "Thunderstorm",
        _ => "Unknown"
    };
}
The deserialization models (GeocodingResponse, OpenMeteoResponse, WeatherResult, etc.) are in the full repo.
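For reference, the records can be as simple as the sketch below. The field names match the Open-Meteo JSON used above, but GeocodingResult and CurrentWeather are my guesses at the nested type names — the repo versions may differ in detail:

```csharp
using System.Text.Json.Serialization;

// Geocoding API: { "results": [ { "name": ..., "country": ..., "latitude": ..., "longitude": ... } ] }
public record GeocodingResponse(
    [property: JsonPropertyName("results")] List<GeocodingResult>? Results);

public record GeocodingResult(
    [property: JsonPropertyName("name")] string Name,
    [property: JsonPropertyName("country")] string Country,
    [property: JsonPropertyName("latitude")] double Latitude,
    [property: JsonPropertyName("longitude")] double Longitude);

// Forecast API: { "current": { "temperature_2m": ..., ... } }
public record OpenMeteoResponse(
    [property: JsonPropertyName("current")] CurrentWeather? Current);

public record CurrentWeather(
    [property: JsonPropertyName("temperature_2m")] double Temperature,
    [property: JsonPropertyName("apparent_temperature")] double ApparentTemperature,
    [property: JsonPropertyName("relative_humidity_2m")] int RelativeHumidity,
    [property: JsonPropertyName("windspeed_10m")] double WindSpeed,
    [property: JsonPropertyName("weathercode")] int WeatherCode);

// Internal result type, independent of the API's wire format
public record WeatherResult(
    string City,
    string Country,
    double Temperature,
    double ApparentTemperature,
    int Humidity,
    double WindSpeed,
    string Condition);
```

Keeping WeatherResult separate from the wire-format records means a change in Open-Meteo's JSON only touches the deserialization types.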
WeatherTools.cs
[McpServerToolType]
public static class WeatherTools
{
    [McpServerTool]
    [Description("Gets current weather conditions for a city using the Open-Meteo API.")]
    public static async Task<string> GetCurrentWeather(
        WeatherService weatherService,
        [Description("City name (e.g. 'Montreal', 'Paris', 'London')")]
        string city)
    {
        var weather = await weatherService.GetCurrentWeatherAsync(city);

        if (weather is null)
            return $"Could not find weather data for '{city}'. Check the city name and try again.";

        return $"""
            Weather for {weather.City}, {weather.Country}:
            - Condition: {weather.Condition}
            - Temperature: {weather.Temperature}°C (feels like {weather.ApparentTemperature}°C)
            - Humidity: {weather.Humidity}%
            - Wind: {weather.WindSpeed} km/h
            """;
    }
}
Two notes on the signature:
- WeatherService weatherService: the SDK injects it from the DI container. No [FromServices], no manual wiring.
- The [Description] on the city parameter: that's what the LLM reads to know what to pass. Be precise here.
Connecting to Claude Desktop
The config lives in a JSON file:
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "weather": {
      "command": "dotnet",
      "args": ["run", "--project", "C:\\path\\to\\mcp-weather-server"]
    }
  }
}
Or if you publish a standalone executable:
{
  "mcpServers": {
    "weather": {
      "command": "C:\\path\\to\\mcp-weather-server.exe"
    }
  }
}
Restart Claude Desktop. A tool icon appears in the input bar. Ask “What’s the weather in Montreal?” and Claude calls GetCurrentWeather, gets the response, and formulates an answer.
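If Claude doesn’t pick the server up, the MCP Inspector (the debugging UI from the modelcontextprotocol project) lets you test the server outside of any AI client, assuming Node.js is installed:

```shell
# Launches the server as a child process and opens a web UI
# where you can list the exposed tools and call GetCurrentWeather by hand.
npx @modelcontextprotocol/inspector dotnet run --project C:\path\to\mcp-weather-server
```

If the tools show up in the Inspector but not in Claude Desktop, the problem is in claude_desktop_config.json, not in the server.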
VS Code Copilot (Bonus)
In .vscode/mcp.json at the workspace root:
{
  "servers": {
    "weather": {
      "type": "stdio",
      "command": "dotnet",
      "args": ["run", "--project", "${workspaceFolder}/mcp-weather-server"]
    }
  }
}
Enable Agent mode in Copilot Chat and the server becomes available.
One Server, Multiple Clients
This is where MCP changes things compared to the ad hoc tool-calling from the agents post: the same binary, unmodified, works with Claude Desktop, VS Code Copilot, Cursor, and JetBrains IDEs.
The business logic (calling Open-Meteo) lives in the server. The protocol is standard. The client doesn’t need to know how the tools work, just that they exist and what they do.
What About Ollama?
Ollama doesn’t support MCP natively; there’s an open issue from November 2024 that’s still not closed. Ollama has its own tool-calling format (OpenAI-compatible), which is a different protocol.
Community Python bridges exist to connect Ollama to MCP servers, but nothing official or stable.
For now: MCP is the territory of Claude Desktop and IDEs. Ollama is the territory of explicitly coded local agents. Both coexist; they address different needs.
Things I Wish I’d Known
A few points that would have saved me time:
- Logs go to stderr, not stdout. A Console.WriteLine inside a tool breaks the transport. Everything on stdout is JSON-RPC; everything else goes to stderr, hence the LogToStandardErrorThreshold in Program.cs.
- Restart Claude Desktop after every change to claude_desktop_config.json. No hot reload.
- The [Description] matters. That's what the LLM reads to decide if the tool is relevant. A vague description = an underused tool.
- Pin the SDK version. It's in preview. An uncontrolled update can change the API.
Resources
- Full code: mongeon/code-examples
- Official C# SDK (GitHub)
- ModelContextProtocol on NuGet
- MCP Spec
- Open-Meteo API
- .NET Blog: Build an MCP server in C#
Happy tool building (and keep stdout clean).