📖 **Source**: TechCrunch
📂 **Category**: AI, Apps, agentic, Apple, coding, developers, xcode
💡 **What You’ll Learn**:
Apple brings agentic coding to Xcode. The company on Tuesday announced the launch of Xcode 26.3, which will allow developers to use agentic coding tools, including Anthropic’s Claude Agent and OpenAI’s Codex, directly in Apple’s official application development suite.
The Xcode 26.3 release candidate is available to all Apple developers today from the developer site and will arrive in the App Store shortly.
This latest update follows the release of Xcode 26 last year, which introduced support for ChatGPT and Claude within Apple’s integrated development environment (IDE) used by those who create apps for iPhone, iPad, Mac, Apple Watch, and other Apple hardware platforms.
The integration of agentic coding tools lets AI models tap into more of Xcode’s features to carry out their tasks and handle more complex automation.
The models will also have access to current Apple developer documentation, so they use the latest APIs and follow best practices as they write code.
At launch, agents can help developers explore a project, understand its structure and metadata, then build the project and run tests to surface any bugs and fix them.
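As a rough mental model, that explore → build → test → fix cycle is just a retry loop over build results. The sketch below is purely illustrative; the `build(fixing:)` stand-in, the issue strings, and the three-attempt cap are all assumptions, not Apple’s implementation:

```swift
// Illustrative only: a toy version of the build/test/fix loop the
// article describes. Names and the retry cap are assumptions.
struct BuildResult {
    let failures: [String]
}

// Stand-in for a real build-and-test run: pretend each pass
// resolves exactly one outstanding issue.
func build(fixing issues: [String]) -> BuildResult {
    BuildResult(failures: Array(issues.dropFirst()))
}

var outstanding = ["missing import", "failing unit test"]
var attempts = 0
while attempts < 3 {
    let result = build(fixing: outstanding)
    if result.failures.isEmpty {
        print("all tests passing after \(attempts + 1) run(s)")
        break
    }
    outstanding = result.failures  // feed failures into the next pass
    attempts += 1
}
// prints "all tests passing after 2 run(s)"
```

The point of the loop shape is that test output flows back in as the next pass’s input, which is what lets the agent iterate without a human re-prompting it each time.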

To prepare for this launch, Apple said it worked closely with Anthropic and OpenAI to design the new experience. Specifically, the company said it has done a lot of work to improve token usage and tool invocation, so that agents run efficiently in Xcode.
Xcode leverages MCP (Model Context Protocol) to expose its capabilities to agents and connect them to its tools. This means that Xcode can now work with any MCP-compliant external agent for things like project discovery, changes, file management, previews, snippets, and access to the latest documentation.
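Since MCP is built on JSON-RPC 2.0, the exchange between an agent and an MCP host like Xcode can be pictured as ordinary request messages: `tools/list` to discover what the host exposes, then `tools/call` to invoke one. A minimal sketch, with the caveat that `JSONRPCRequest` is hand-rolled here (not an Apple API) and `build_project` is a hypothetical tool name, not one of Xcode’s actual identifiers:

```swift
import Foundation

// MCP traffic is JSON-RPC 2.0. A client discovers a host's tools with
// "tools/list", then invokes one with "tools/call".
struct JSONRPCRequest: Codable {
    var jsonrpc = "2.0"
    var id: Int
    var method: String
    var params: [String: String]?  // real "tools/call" params nest an
                                   // "arguments" object; flattened here
}

let encoder = JSONEncoder()
encoder.outputFormatting = [.sortedKeys, .withoutEscapingSlashes]

// Step 1: ask the host what tools it exposes.
let discover = JSONRPCRequest(id: 1, method: "tools/list", params: nil)
let wire = String(data: try encoder.encode(discover), encoding: .utf8)!
print(wire)
// {"id":1,"jsonrpc":"2.0","method":"tools/list"}

// Step 2: call one of the discovered tools ("build_project" is made up).
let invoke = JSONRPCRequest(id: 2, method: "tools/call",
                            params: ["name": "build_project"])
let callWire = String(data: try encoder.encode(invoke), encoding: .utf8)!
print(callWire)
// {"id":2,"jsonrpc":"2.0","method":"tools/call","params":{"name":"build_project"}}
```

Because the wire format is this generic, any MCP-compliant agent can drive Xcode’s tools without Xcode knowing anything about the model behind it.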
Developers who want to try out the agentic coding feature must first download the agents they want to use from Xcode Settings. They can also link their accounts to AI providers by logging in or adding their API key. An in-app drop-down menu allows developers to choose which model version they want to use (for example, GPT-5.2-Codex vs. GPT-5.1 mini).
In the prompt box on the left side of the screen, developers can use natural language commands to tell the agent what kind of project they want to create or what code changes they want to make. For example, they can instruct Xcode to add a feature to their app that uses one of Apple’s frameworks, and describe how it should look and work.

When the agent starts running, it breaks tasks down into smaller steps, making it easier to see what’s happening and how the code is changing. It will also look up the documentation it needs before it starts writing code. Changes are highlighted visually within the code, and progress text on the side of the screen lets developers see what’s happening under the hood.
Apple believes this transparency can especially help new developers learning to code. To that end, the company is hosting a “Coding How-to” workshop Thursday on its developer site, where users can watch and learn how to use agentic coding tools as they code in real time in their own copy of Xcode.
At the end of the process, the AI agent verifies that the code it generated works as expected. Armed with the results of those tests, the agent can iterate further on the project if necessary to fix bugs or other issues. (Apple notes that asking an agent to think through its plans before writing code can sometimes improve the process, because it forces the agent to do some advance planning.)
Additionally, if developers are not satisfied with the results, they can easily revert their code to its original state at any time, as Xcode creates milestones every time the agent makes a change.
🕒 **Posted on**: 1770170669
