IntelligenceX
Cmdlet

Start-IntelligenceXThread

Namespace: IntelligenceX.PowerShell
Inputs: IntelligenceX.OpenAI.IntelligenceXClient
Outputs: IntelligenceX.Json.JsonValue, IntelligenceX.OpenAI.AppServer.Models.ThreadInfo

Creates a new conversation thread.

Examples

Example 1


Start-IntelligenceXThread -Model "gpt-5.3-codex"

Start a thread, specifying only the required model.

Example 2


Start-IntelligenceXThread -Model "gpt-5.3-codex" -CurrentDirectory "C:\repo" -Sandbox "workspace"

Start a thread scoped to a repository, sandboxed to the workspace.

Example 3


Start-IntelligenceXThread -Model "gpt-5.3-codex" -ApprovalPolicy "auto"

Start a thread with an explicit approval policy.

Common Parameters

This command supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable.

For more information, see about_CommonParameters.
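
The common parameters combine freely with the cmdlet's own parameters. For instance, -ErrorAction and -ErrorVariable (standard PowerShell behavior, not specific to this module) can suppress an error display while still capturing the failure for inspection:

Start-IntelligenceXThread -Model "gpt-5.3-codex" -ErrorAction SilentlyContinue -ErrorVariable threadErr

If the call fails, the error record is available afterwards in $threadErr.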

Syntax

Start-IntelligenceXThread -Model <string> [-ApprovalPolicy <string>] [-Client <IntelligenceXClient>] [-CurrentDirectory <string>] [-Raw] [-Sandbox <string>] [<CommonParameters>]

Parameters

-Model <string> (required; position: named; pipeline input: false)
    Model identifier to use (for example, gpt-5.3-codex).

-ApprovalPolicy <string> (optional; position: named; pipeline input: false)
    Approval policy name to use.

-Client <IntelligenceXClient> (optional; position: named; pipeline input: true, ByValue)
    Client instance to use. Defaults to the active client.

-CurrentDirectory <string> (optional; position: named; pipeline input: false)
    Working directory to pass to the app-server.

-Raw <SwitchParameter> (optional; position: named; pipeline input: false)
    Return the raw JSON response.

-Sandbox <string> (optional; position: named; pipeline input: false)
    Sandbox mode to use (for example, 'workspace' or 'danger-full-access').

Outputs

IntelligenceX.Json.JsonValue, IntelligenceX.OpenAI.AppServer.Models.ThreadInfo