The OpenAI .NET SDK (the `OpenAI` NuGet package, v2.9.1) currently ships all feature areas in a single
monolithic package. This document proposes splitting it into discrete per-feature packages following
the naming pattern `OpenAI.<Feature>`, linked together by a top-level `OpenAI` metapackage.
The following table correlates the SDK namespaces, client types, and public type counts. All sources (REST spec, SDK surface, platform docs) agree on the feature area segmentation.
| Feature Area | SDK Namespace | Client Type | Public Types |
|---|---|---|---|
| Responses | `OpenAI.Responses` | `ResponsesClient` | 136 |
| Chat | `OpenAI.Chat` | `ChatClient` | 46 |
| Realtime | `OpenAI.Realtime` | `RealtimeClient`, `RealtimeSessionClient` | 138 |
| Audio | `OpenAI.Audio` | `AudioClient` | 27 |
| Images | `OpenAI.Images` | `ImageClient` | 11 |
| Embeddings | `OpenAI.Embeddings` | `EmbeddingClient` | 7 |
| Files | `OpenAI.Files` | `OpenAIFileClient` | 8 |
| VectorStores | `OpenAI.VectorStores` | `VectorStoreClient` | 19 |
| Models | `OpenAI.Models` | `OpenAIModelClient` | 6 |
| Moderations | `OpenAI.Moderations` | `ModerationClient` | 9 |
| FineTuning | `OpenAI.FineTuning` | `FineTuningClient` | 21 |
| Batch | `OpenAI.Batch` | `BatchClient` | 5 |
| Evals + Graders | `OpenAI.Evals`, `OpenAI.Graders` | `EvaluationClient`, `GraderClient` | 19 |
| Containers | `OpenAI.Containers` | `ContainerClient` | 12 |
| Conversations | `OpenAI.Conversations` | `ConversationClient` | 2 |
| Assistants | `OpenAI.Assistants` | `AssistantClient` | 68 |
| Videos | `OpenAI.Videos` | `VideoClient` | 2 |
| Skills | (not yet in SDK) | — | — |
| Administration | (not yet in SDK) | — | — |
Key observations:

- Graders has its own SDK namespace and client (`GraderClient`) but shares a REST spec file with Evals. They will be merged into a single `OpenAI.Evals` package.
- Skills and Administration exist in the REST spec but are not yet implemented in the SDK.
- Assistants (68 types) and Videos (2 types) are announced as deprecated for 2026.
- Four features are "completion APIs" with heavy conceptual overlap (Assistants, Chat, Responses, Realtime); each defines its own messages, tools, and streaming events independently.
```mermaid
graph TD
    T3["<b>OpenAI</b><br/><i>Metapackage</i>"]
    T3 --> Resp["OpenAI.Responses"]
    T3 --> Chat["OpenAI.Chat"]
    T3 --> RT["OpenAI.Realtime"]
    T3 --> Files["OpenAI.Files"]
    T3 --> More["... 13 more Tier 2 packages"]
    Resp --> T1
    Chat --> T1
    RT --> T1
    Files --> T1
    More --> T1
    T1["<b>OpenAI.Core ?</b><br/><i>Foundation</i>"]
```
The existing `OpenAI` package name becomes a metapackage that references all Tier 2 packages.

Contains:

- `OpenAIClient` (a factory class with `Get*Client()` methods for every feature)
- A `PackageReference` to every feature package

This ensures that existing users who `Install-Package OpenAI` continue to get everything.
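As a sketch of this facade (the `Get*Client()` shape mirrors the existing SDK, but the exact constructors and method list here are assumptions, not the shipped implementation):

```csharp
// Hypothetical sketch: OpenAIClient lives in the OpenAI metapackage; each
// Get*Client() method returns a client type defined in a feature package.
public partial class OpenAIClient
{
    private readonly ApiKeyCredential _credential;
    private readonly OpenAIClientOptions _options;

    public OpenAIClient(ApiKeyCredential credential, OpenAIClientOptions options = null)
    {
        _credential = credential;
        _options = options ?? new OpenAIClientOptions();
    }

    // Each factory method forwards the shared credential/options to a feature client.
    public ChatClient GetChatClient(string model) => new(model, _credential, _options);
    public ResponsesClient GetResponsesClient() => new(_credential, _options);
    public OpenAIFileClient GetOpenAIFileClient() => new(_credential, _options);
    // ... one Get*Client() method per feature package
}
```

Because the metapackage references every feature package, all of these return types resolve transitively for existing users.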
Each package contains one feature area's client, models, and streaming types.
| Package | Namespace(s) | Client(s) | Types |
|---|---|---|---|
| `OpenAI.Responses` | `OpenAI.Responses` | `ResponsesClient` | 136 |
| `OpenAI.Chat` | `OpenAI.Chat` | `ChatClient` | 46 |
| `OpenAI.Realtime` | `OpenAI.Realtime` | `RealtimeClient`, `RealtimeSessionClient` | 138 |
| `OpenAI.Audio` | `OpenAI.Audio` | `AudioClient` | 27 |
| `OpenAI.Images` | `OpenAI.Images` | `ImageClient` | 11 |
| `OpenAI.Embeddings` | `OpenAI.Embeddings` | `EmbeddingClient` | 7 |
| `OpenAI.Files` | `OpenAI.Files` | `OpenAIFileClient` | 8 |
| `OpenAI.VectorStores` | `OpenAI.VectorStores` | `VectorStoreClient` | 19 |
| `OpenAI.Models` | `OpenAI.Models` | `OpenAIModelClient` | 6 |
| `OpenAI.Moderations` | `OpenAI.Moderations` | `ModerationClient` | 9 |
| `OpenAI.FineTuning` | `OpenAI.FineTuning` | `FineTuningClient` | 21 |
| `OpenAI.Batch` | `OpenAI.Batch` | `BatchClient` | 5 |
| `OpenAI.Evals` | `OpenAI.Evals`, `OpenAI.Graders` | `EvaluationClient`, `GraderClient` | 19 |
| `OpenAI.Containers` | `OpenAI.Containers` | `ContainerClient` | 12 |
| `OpenAI.Conversations` | `OpenAI.Conversations` | `ConversationClient` | 2 |
| `OpenAI.Assistants` | `OpenAI.Assistants` | `AssistantClient` | 68 |
| `OpenAI.Videos` | `OpenAI.Videos` | `VideoClient` | 2 |
17 feature packages total.
Notes:

- No feature package takes a NuGet dependency on another feature package. All cross-feature relationships are developer workflow dependencies: the developer must install companion packages to prepare resources (e.g., uploading files, creating vector stores), but no .NET types are imported across package boundaries.
- Each package will have the model factory, serialization context, DI extensions, and JSON schema definitions related to the types in that package.
- Evals is a notable case: while the eval service internally calls the Responses and Chat APIs, the SDK types are self-contained. `CreateEvalResponsesRunDataSource` and `CreateEvalCompletionsRunDataSource` are defined entirely within the Evals spec and accept configuration via primitive types. No .NET types from `OpenAI.Responses` or `OpenAI.Chat` are imported.
⚠ Open question: Whether `OpenAI.Core` should exist is under consideration. The alternative is to duplicate `OpenAIClientOptions` or eliminate the shared type. See the Type Analysis section for details.
A minimal shared foundation package. All feature packages would depend on this.
Would contain (public types):
- `OpenAIClientOptions`: the only verified public OpenAI-authored type referenced across all feature namespaces. Every feature client constructor accepts it:

  ```csharp
  public ChatClient(string model, ApiKeyCredential credential, OpenAIClientOptions options);
  public ResponsesClient(ApiKeyCredential credential, OpenAIClientOptions options);
  // ... same pattern for all clients
  ```

- Future home for consolidated shared types (see Type Analysis section)
Internal types are shared as source, not via Core. All internal utilities, models, and infrastructure types are shared via linked source files (`<Compile Include="..">`). Each feature assembly compiles its own copy (e.g., `Argument`, `BinaryContentHelper`, `ModelSerializationExtensions`, ...).
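As a sketch of the linked-source mechanism (the file paths and project name below are illustrative, not the repository's actual layout), each feature project pulls the shared internals in via `Compile` items:

```xml
<!-- Hypothetical fragment of a feature package's project file, e.g. OpenAI.Chat.csproj. -->
<!-- The shared internal helpers are compiled directly into this assembly rather than -->
<!-- referenced from an OpenAI.Core binary, so no NuGet dependency is created. -->
<ItemGroup>
  <Compile Include="..\Shared\Argument.cs" Link="Shared\Argument.cs" />
  <Compile Include="..\Shared\BinaryContentHelper.cs" Link="Shared\BinaryContentHelper.cs" />
  <Compile Include="..\Shared\ModelSerializationExtensions.cs" Link="Shared\ModelSerializationExtensions.cs" />
</ItemGroup>
```

The trade-off of source linking is duplicated IL across assemblies; the benefit is that internal types can evolve without any binary-compatibility constraints between feature packages.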
| Package | Status | Notes |
|---|---|---|
| `OpenAI.Skills` | In development | REST spec exists; used for agentic skill execution via Responses |
| `OpenAI.Administration` | Not started | REST spec exists (67 operations) |
All packages (Core, feature packages, metapackage) share coordinated version numbers.
These are dependencies identified from the REST API specifications and platform documentation where one feature area's API references resources managed by another feature area. They represent developer workflow connections, not compile-time type dependencies. No feature package imports .NET types from another feature package.
- **ID-reference**: Feature A's API accepts IDs (e.g., `file_id`, `vector_store_id`) that belong to Feature B. Users need Feature B's client to create/manage those resources.
- **Output**: Feature A's API produces resources (e.g., generated files) that belong to Feature B. Users need Feature B's client to download/access those outputs.
- **Tool**: Feature A defines built-in tools (e.g., `file_search`, `code_interpreter`) whose configuration references resources from Feature B. The tool types themselves (e.g., `FileSearchTool`) are defined within Feature A's package and accept resource IDs as strings, so no .NET type from Feature B is imported. The developer must use Feature B's API to create the referenced resources before invoking the tool.
- **Endpoint**: Feature A targets Feature B's API endpoint server-side. The SDK types are self-contained within Feature A.
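The Tool pattern is the subtlest of these, so a minimal sketch may help (the property shape shown here is an assumption, not the SDK's verified surface):

```csharp
// Hypothetical sketch: FileSearchTool is defined entirely in OpenAI.Responses and
// references the vector store only by its string ID, so OpenAI.Responses never
// imports a .NET type from OpenAI.VectorStores.
var tool = new FileSearchTool
{
    // "vs_abc123" would come from a vector store created earlier with
    // VectorStoreClient, which lives in the OpenAI.VectorStores package.
    VectorStoreIds = { "vs_abc123" },
};
```

The string-ID boundary is what keeps these workflow dependencies out of the NuGet dependency graph.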
Dependencies are broken down by direction: input (what you need to create before calling this feature) and output (what the service produces that requires another feature to access). Output sections are omitted when a feature produces no external assets.
**Responses** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference + Tool | file_id in request parameters; files uploaded via Files API |
| VectorStores | Tool | vector_store_ids in File Search tool; stores created via VectorStores API |
| Containers | ID-reference + Tool | container_id in Code Interpreter / Shell Tool; containers created via Containers API |
| Conversations | ID-reference | conversation_id in request parameters; conversations created via Conversations API |
Output dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | Output | Code Interpreter produces files; file_id must be downloaded via Files API |
**Chat** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | file_id in request parameters; files uploaded via Files API |
**Realtime** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | file_id in request parameters; files uploaded via Files API |
**Evals** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | file_id in eval data sources; files uploaded via Files API |
| Containers | ID-reference | container_id in Code Interpreter config; containers created via Containers API |
| VectorStores | Tool | vector_store_ids in File Search config; stores created via VectorStores API |
| Responses | Endpoint | CreateEvalResponsesRunDataSource targets the Responses API endpoint |
| Chat | Endpoint | CreateEvalCompletionsRunDataSource targets the Chat API endpoint |
**Batch** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | input_file_id in request parameters; JSONL uploaded via Files API |
Output dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | Output | output_file_id, error_file_id must be downloaded via Files API |
**FineTuning** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | training_file and validation_file in request parameters; data uploaded via Files API |
**VectorStores** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | file_id in request parameters; files uploaded via Files API |
**Containers** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | file_ids in request parameters; files uploaded via Files API |
**Assistants** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference + Tool | file_id in request parameters; files uploaded via Files API |
| VectorStores | Tool | vector_store_ids in File Search config; stores created via VectorStores API |
Output dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | Output | Code Interpreter produces files; file_id must be downloaded via Files API |
**Images** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | file_id in image edit parameters; source images and masks uploaded via Files API |
**Conversations** input dependencies:
| Depends On | Nature | Evidence |
|---|---|---|
| Files | ID-reference | file_id in citations and item content; files uploaded via Files API |
| Containers | ID-reference | container_id in citations; containers created via Containers API |
These features have no service-level dependencies on other features:
| Package | Notes |
|---|---|
| `OpenAI.Audio` | Speech generation, transcription, translation |
| `OpenAI.Embeddings` | Embedding generation |
| `OpenAI.Models` | Model listing/info |
| `OpenAI.Moderations` | Content moderation |
| `OpenAI.Videos` | Video generation (deprecated) |
| Feature | Depended On By | Count |
|---|---|---|
| Files | Responses, Chat, Realtime, Assistants, Evals, FineTuning, Batch, VectorStores, Containers, Images, Conversations | 11 |
| VectorStores | Responses, Assistants, Evals | 3 |
| Containers | Responses, Evals, Conversations | 3 |
| Responses | Evals | 1 |
| Chat | Evals | 1 |
| Conversations | Responses | 1 |
Files is the most interconnected feature: 11 of 17 features reference file IDs or produce
files, making OpenAI.Files a near-universal companion package.
Some scenarios create transitive developer workflow chains:
- Responses (File Search) → VectorStores → Files
- Responses (Code Interpreter) → Containers → Files
- Evals → Responses → (any Responses tool deps)
- Batch → Files (always, both input and output)
The longest chain is Evals → Responses → VectorStores → Files (4 packages deep).
These are NOT NuGet transitive dependencies (packages don't reference each other). They are developer workflow chains. The developer must install each package in the chain to accomplish their scenario.
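As a concrete sketch of the File Search chain (the method names follow the current SDK's general shape, but the exact signatures here are assumptions), a developer touches three packages while only string IDs cross package boundaries:

```csharp
// Hypothetical end-to-end sketch of Responses (File Search) → VectorStores → Files.
var credential = new ApiKeyCredential(
    Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// 1. OpenAI.Files: upload the source document.
var fileClient = new OpenAIFileClient(credential);
var file = fileClient.UploadFile("manual.pdf", FileUploadPurpose.Assistants);

// 2. OpenAI.VectorStores: index the uploaded file into a vector store.
var storeClient = new VectorStoreClient(credential);
var store = storeClient.CreateVectorStore(new[] { file.Value.Id });

// 3. OpenAI.Responses: query with a File Search tool configured from the
//    vector store's string ID; no VectorStores .NET type appears here.
var responsesClient = new ResponsesClient(credential);
// ... attach a File Search tool referencing store.Value.Id and send the request
```

Installing only `OpenAI.Responses` would compile fine but fail at the first step, which is exactly the discovery friction the scenario table below quantifies.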
These illustrate which packages a developer needs for typical workflows.
| Scenario | Packages Needed |
|---|---|
| **Simple chatbot**: basic request/response text generation using the Chat Completions API | `OpenAI.Chat` |
| **Text generation with Responses**: text generation using the primary Responses API, including web search and function calling | `OpenAI.Responses` |
| **RAG with Responses**: retrieval-augmented generation using File Search over uploaded documents in vector stores | `OpenAI.Responses`, `OpenAI.VectorStores`, `OpenAI.Files` |
| **Agent with Code Interpreter**: agentic workflow where the model writes and executes code in a sandboxed container | `OpenAI.Responses`, `OpenAI.Containers`, `OpenAI.Files` |
| **Full agent (RAG + Code Interpreter)**: combined RAG and code execution, the most common full-featured agent scenario | `OpenAI.Responses`, `OpenAI.VectorStores`, `OpenAI.Containers`, `OpenAI.Files` |
| **Agent with conversations**: stateful multi-turn agent using conversation persistence across requests | Above + `OpenAI.Conversations` |
| **Real-time voice**: bidirectional audio streaming for voice assistants and real-time interactions | `OpenAI.Realtime` |
| **Embeddings for search**: generating vector embeddings for semantic search or clustering | `OpenAI.Embeddings` |
| **Image generation**: creating or editing images via DALL·E models | `OpenAI.Images` |
| **Fine-tuning a model**: training a custom model on uploaded datasets | `OpenAI.FineTuning`, `OpenAI.Files` |
| **Batch processing**: submitting bulk jobs (JSONL) targeting any supported API endpoint | `OpenAI.Batch`, `OpenAI.Files` + target feature |
| **Running evals**: evaluating model outputs using configurable graders and datasets | `OpenAI.Evals`, `OpenAI.Files` + target feature |
| **Content moderation**: classifying text for policy violations | `OpenAI.Moderations` |
| **Speech generation**: text-to-speech audio output | `OpenAI.Audio` |
| **Everything (current behavior)**: install the metapackage for full access to all features | `OpenAI` (metapackage) |
- **Simple scenarios (1 package):** Chat, basic Responses, Embeddings, Images, Audio, Moderations, and Models are fully self-contained. Developers install one feature package and go. This covers the most common entry points.
- **Moderate scenarios (2–3 packages):** RAG, Code Interpreter, fine-tuning, and batch processing all require `OpenAI.Files` plus the primary feature. Developers need to know to install companion packages; clear documentation and IDE suggestions can mitigate this.
- **Complex agent scenarios (4–5 packages):** Full agents with RAG, Code Interpreter, and Conversations require 4–5 packages. This is the worst case for package discovery friction, but these developers are power users who are comfortable managing dependencies.
- **Escape hatch:** The `OpenAI` metapackage always provides the "install everything" option, matching today's behavior. New users, and anyone who doesn't want to think about package granularity, can use it.
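For a moderate scenario such as RAG, setup stays a short sequence of installs (package names are per this proposal; none of them exist on NuGet today):

```shell
# Hypothetical: the three packages a RAG-with-Responses project would reference.
dotnet add package OpenAI.Responses
dotnet add package OpenAI.VectorStores
dotnet add package OpenAI.Files
```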
A key design question for the split is which types should live in OpenAI.Core (shared binary)
versus being duplicated per feature package (or shared as source).
These are public types that appear in the API surface of multiple feature packages. They cannot be duplicated because consumers need a single type identity across packages.
| Type | Current Namespace | Referenced By | Rationale |
|---|---|---|---|
| `OpenAIClientOptions` | `OpenAI` | Every feature client constructor | Cannot be duplicated: a consumer passing options to multiple clients needs a single type identity; otherwise each client would expose a distinctly named options type |
This is currently the only type that definitively requires Core. Whether this justifies a dedicated package or should be handled differently is an open question.
Issue #1069 identifies duplicate enums that exist across feature areas with identical semantics. Unlike the completion API types (which are intentionally independent per feature), these represent the same concept and should potentially be consolidated.
| Candidate Type | Currently In | Also Referenced By | Consolidation Impact |
|---|---|---|---|
| `ImageGenerationToolQuality` / `GeneratedImageQuality` | `OpenAI.Responses` (via `ImageGenerationTool`) | `OpenAI.Images` (via `ImageGenerationOptions`) | Same quality values (e.g., `hd`, `standard`) used in both; different type names |
| `ImageGenerationToolSize` / `GeneratedImageSize` | `OpenAI.Responses` | `OpenAI.Images` | Same size values used in both; different type names |
| `ImageGenerationToolOutputFileFormat` / `GeneratedImageFileFormat` | `OpenAI.Responses` | `OpenAI.Images` | Same formats (`jpeg`, `png`, `webp`); different type names |
| `DataOrLocation` (pending discussion) | — | Images, Audio, and other features with file-or-data output | Allows either `BinaryData` or a `Uri` pointer for generated output |
If consolidated into Core, these types would be shared across packages via binary reference. For example, a consumer using both `OpenAI.Responses` and `OpenAI.Images` would get a single image quality type from Core rather than seeing `ImageGenerationToolQuality` and `GeneratedImageQuality` as separate types.
If left duplicated, each package defines its own copy. This is simpler for the package split but means consumers working across both features see two identical-looking types.
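To make the duplication concrete from a consumer's point of view (the type names come from the table above; the member name and mapping helper are assumptions for illustration):

```csharp
// Hypothetical illustration of issue #1069: two enums with identical semantics
// but distinct .NET type identities, one per feature area.
ImageGenerationToolQuality toolQuality = /* from OpenAI.Responses */ default;
GeneratedImageQuality imageQuality = /* from OpenAI.Images */ default;

// Code working across both features must convert by hand, e.g.:
static GeneratedImageQuality ToImagesQuality(ImageGenerationToolQuality q) =>
    q.ToString() switch
    {
        "Standard" => GeneratedImageQuality.Standard,
        "High" => GeneratedImageQuality.High,
        _ => throw new ArgumentOutOfRangeException(nameof(q)),
    };
// With a consolidated Core enum, this mapping layer disappears entirely.
```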
Recommendation: Resolve issue #1069 before or during the split. If the enums are consolidated, Core grows from 1 type to ~5+ types, making it more substantial. If they remain duplicated, document the intentional duplication.
- **Core package existence**: Should `OpenAI.Core` exist? Currently only `OpenAIClientOptions` definitively requires it. If issue #1069 consolidates shared enums and `DataOrLocation` is added, Core becomes more substantial. Otherwise, alternatives (e.g., duplicating the type or using a different sharing mechanism) may be preferable.
- **Skills package**: In development; type surface and dependencies unknown.
| Question | Decision |
|---|---|
| Versioning | Coordinated: all packages share the same version number |
| Evals/Graders | Merge: single OpenAI.Evals package containing both namespaces |
| ModelFactory & DI | Split per-feature: each package provides its own |
| Source | What was learned |
|---|---|
| SDK public API surface (`api/OpenAI.net10.0.cs`) | 18 sub-namespaces + root, 515 public types, cross-namespace type refs |
| SDK source code (`src/`) | Internal shared types, Generated + Custom structure, build dependencies |
| OpenAI platform docs (developers.openai.com) | Tool integrations, feature connections, developer workflows |
| Colleague notes (`package-thoughts.txt`) | Feature groupings, deprecation timeline, Responses ecosystem |
| GitHub issues & labels (openai/openai-dotnet) | Per-feature labels; issue #1069 duplicate enum consolidation |
| GitHub repositories | User scenarios |
| StackOverflow questions | User scenarios |
| Web (articles and examples) | User scenarios |