Take your iPad and iPhone apps even further on Apple Vision Pro

A brand‑new App Store will launch with Apple Vision Pro, featuring apps and games built for visionOS, as well as hundreds of thousands of iPad and iPhone apps that run great on visionOS too. Users can access their favorite iPad and iPhone apps side by side with new visionOS apps on the infinite canvas of Apple Vision Pro, enabling them to be more connected, productive, and entertained than ever before. And since most iPad and iPhone apps run on visionOS as is, your app experiences can easily extend to Apple Vision Pro from day one — with no additional work required.

Timing. Starting this fall, an upcoming developer beta release of visionOS will include the App Store. By default, your iPad and/or iPhone apps will be published automatically on the App Store on Apple Vision Pro. Most frameworks available in iPadOS and iOS are also included in visionOS, which means nearly all iPad and iPhone apps can run on visionOS, unmodified. Customers will be able to use your apps on visionOS early next year when Apple Vision Pro becomes available.

Making updates, if needed. If your app requires a capability that’s unavailable on Apple Vision Pro, App Store Connect will indicate that your app isn’t compatible and it won’t be made available. To make your app available, you can provide alternative functionality or update its UIRequiredDeviceCapabilities property list entry. If you need to edit your existing app’s availability, you can do so at any time in App Store Connect.
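
As an illustrative sketch of one possible Info.plist change: if the only blocking entry in UIRequiredDeviceCapabilities is a capability Apple Vision Pro lacks, removing it (while handling the feature's absence at runtime) restores compatibility:

```xml
<!-- Info.plist: UIRequiredDeviceCapabilities should list only capabilities
     the app truly cannot run without. An entry such as "telephony" or
     "gps" would exclude Apple Vision Pro from the app's availability. -->
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>arm64</string>
</array>
```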

To see your app in action, use the visionOS simulator in Xcode 15 beta. The simulator lets you interact with and easily test most of your app’s core functionality. To run and test your app on an Apple Vision Pro device, you can submit your app for a compatibility evaluation or sign up for a developer lab.

Beyond compatibility. If you want to take your app to the next level, you can make your app experience feel more natural on visionOS by building your app with the visionOS SDK. Your app will adopt the standard visionOS system appearance and you can add elements, such as 3D content tuned for eyes and hands input. To learn how to build an entirely new app or game that takes advantage of the unique and immersive capabilities of visionOS, view our design and development resources.

Updated Apple Developer Program License Agreement now available

The Apple Developer Program License Agreement has been revised to support updated policies and upcoming features, and to provide clarification. The revisions include:

  • Definitions, Section 3.3.39: Specified requirements for use of the Journaling Suggestions API.

  • Schedule 1 Section 6.3 and Schedules 2 and 3 Section 7.3: Added clarifying language that the content moderation process is subject to human and systematic review and action pursuant to notices of illegal and harmful content.

  • Schedule 1 Exhibit D Section 3 and Schedules 2 and 3 Exhibit E Section 3: Added language about the Digital Services Act (DSA) redress options available to developers based in the European Union.

View full terms and conditions

Inside the Apple Vision Pro labs

As CEO of Flexibits, the team behind successful apps like Fantastical and Cardhop, Michael Simmons has spent more than a decade minding every last facet of his team’s work. But when he brought Fantastical to the Apple Vision Pro labs in Cupertino this summer and experienced it for the first time on the device, he felt something he wasn’t expecting.

“It was like seeing Fantastical for the first time,” he says. “It felt like I was part of the app.”

That sentiment has been echoed by developers around the world. Since debuting in early August, the Apple Vision Pro labs have hosted developers and designers like Simmons in London, Munich, Shanghai, Singapore, Tokyo, and Cupertino. During the day-long lab appointment, people can test their apps, get hands-on experience, and work with Apple experts to get their questions answered. Developers can apply to attend if they have a visionOS app in active development or an existing iPadOS or iOS app they’d like to test on Apple Vision Pro.

Learn more about Apple Vision Pro developer labs

For his part, Simmons saw Fantastical work right out of the box. He describes the labs as “a proving ground” for future explorations and a chance to push software beyond its current bounds. “A bordered screen can be limiting. Sure, you can scroll, or have multiple monitors, but generally speaking, you’re limited to the edges,” he says. “Experiencing spatial computing not only validated the designs we’d been thinking about — it helped us start thinking not just about left to right or up and down, but beyond borders at all.”

And as not just CEO but the lead product designer (and the guy who “still comes up with all these crazy ideas”), he came away from the labs with a fresh batch of spatial thoughts. “Can people look at a whole week spatially? Can people compare their current day to the following week? If a day is less busy, can people make that day wider? And then, what if like you have the whole week wrap around you in 360 degrees?” he says. “I could probably — not kidding — talk for two hours about this.”

‘The audible gasp’

David Smith is a prolific developer, prominent podcaster, and self-described planner. Shortly before his inaugural visit to the Apple Vision Pro developer labs in London, Smith prepared all the necessary items for his day: a MacBook, Xcode project, and checklist (on paper!) of what he hoped to accomplish.

All that planning paid off. During his time with Apple Vision Pro, “I checked everything off my list,” Smith says. “From there, I just pretended I was at home developing the next feature.”

Smith began working on a version of his app Widgetsmith for spatial computing almost immediately after the release of the visionOS SDK. Though the visionOS simulator provides a solid foundation to help developers test an experience, the labs offer a unique opportunity for a full day of hands-on time with Apple Vision Pro before its public release. “I’d been staring at this thing in the simulator for weeks and getting a general sense of how it works, but that was in a box,” Smith says. “The first time you see your own app running for real, that’s when you get the audible gasp.”

Smith wanted to start working on the device as soon as possible, so he could get “the full experience” and begin refining his app. “I could say, ‘Oh, that didn’t work? Why didn’t it work?’ Those are questions you can only truly answer on-device.” Now, he has plenty more plans to make — as evidenced by his paper checklist, which he holds up and flips over, laughing. “It’s on this side now.”

‘We understand where to go’

When it came to testing Pixite’s video creator and editor Spool, chief experience officer Ben Guerrette made exploring interactions a priority. “What’s different about our editor is that you’re tapping videos to the beat,” he says. “Spool is great on touchscreens because you have the instrument in front of you, but with Apple Vision Pro you’re looking at the UI you’re selecting — and in our case, that means watching the video while tapping the UI.”

The team spent time in the lab exploring different interaction patterns to address this core challenge. “At first, we didn’t know if it would work in our app,” Guerrette says. “But now we understand where to go. That kind of learning experience is incredibly valuable: It gives us the chance to say, ‘OK, now we understand what we’re working with, what the interaction is, and how we can make a stronger connection.’”

Chris Delbuck, principal design technologist at Slack, had intended to test the iPadOS version of the company’s app on Apple Vision Pro. As he spent time with the device, however, “it instantly got me thinking about how 3D offerings and visuals could come forward in our experiences,” he says. “I wouldn’t have been able to do that without having the device in hand.”

‘That will help us make better apps’

As lab participants like Smith continue their development at home, they’ve brought back lessons and learnings from their time with Apple Vision Pro. “It’s not necessarily that I solved all the problems — but I solved enough to have a sense of the kinds of solutions I’d likely need,” Smith says. “Now there’s a step change in my ability to develop in the simulator, write quality code, and design good user experiences.”

Simmons says that the labs offered not just a playground, but a way to shape and streamline his team’s thinking about what a spatial experience could truly be. “With Apple Vision Pro and spatial computing, I’ve truly seen how to start building for the boundless canvas — how to stop thinking about what fits on a screen,” he says. “And that will help us make better apps.”

Helping customers resolve billing issues without leaving your app

As announced in April, your customers will soon be able to resolve payment issues without leaving your app, making it easier for them to stay subscribed to your content, services, and premium features.

Starting August 14, 2023, if an auto-renewable subscription doesn’t renew because of a billing issue, a system-provided sheet will appear in your app with a prompt that lets customers update the payment method for their Apple ID. You can test this sheet in Sandbox, as well as delay or suppress it using the Message and display APIs in StoreKit. This feature is available in iOS 16.4 and iPadOS 16.4 or later, and no action is required to adopt it.
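
A minimal sketch of deferring the sheet with StoreKit's Message API, assuming a UIKit window scene is available and that the app wants to hold the sheet until its own flow finishes:

```swift
import StoreKit
import UIKit

// Listen for pending App Store messages and decide when to show them.
// By default the billing-issue sheet appears automatically; reading the
// Message.messages sequence lets the app defer it past a critical moment
// (for example, mid-checkout) and present it later.
func handleStoreMessages(in scene: UIWindowScene) async {
    for await message in Message.messages {
        guard message.reason == .billingIssue else { continue }
        // Wait for your own UI flow to finish, then present the sheet.
        try? message.display(in: scene)
    }
}
```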

Learn about the system-provided sheet

Learn how to test billing issues in Sandbox

List of APIs that require declared reasons now available

Apple is committed to protecting user privacy on our platforms. We know that there are a small set of APIs that can be misused to collect data about users’ devices through fingerprinting, which is prohibited by our Developer Program License Agreement. To prevent the misuse of these APIs, we announced at WWDC23 that developers will need to declare the reasons for using these APIs in their app’s privacy manifest. This will help ensure that apps only use these APIs for their intended purpose. As part of this process, you’ll need to select one or more approved reasons that accurately reflect how your app uses the API, and your app can only use the API for the reasons you’ve selected.

Starting in fall 2023, when you upload a new app or app update to App Store Connect that uses an API (including from third-party SDKs) that requires a reason, you’ll receive a notice if you haven’t provided an approved reason in your app’s privacy manifest. And starting in spring 2024, in order to upload your new app or app update to App Store Connect, you’ll be required to include an approved reason in the app’s privacy manifest which accurately reflects how your app uses the API.
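
As an illustrative sketch, a PrivacyInfo.xcprivacy entry declaring the UserDefaults required-reason API category with the CA92.1 reason code (access limited to the app itself) looks like this:

```xml
<key>NSPrivacyAccessedAPITypes</key>
<array>
    <dict>
        <!-- The category of required-reason API the app (or an SDK it
             includes) uses -->
        <key>NSPrivacyAccessedAPIType</key>
        <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
        <!-- One or more approved reason codes from Apple's published list -->
        <key>NSPrivacyAccessedAPITypeReasons</key>
        <array>
            <string>CA92.1</string>
        </array>
    </dict>
</array>
```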

If you have a use case for an API with required reasons that isn’t already covered by an approved reason and the use case directly benefits the people using your app, let us know.

View list of APIs and approved reasons

Submit a request for a new approved reason

Take your apps and games beyond the visionOS simulator

Apple Vision Pro compatibility evaluations

We can help you make sure your visionOS, iPadOS, and iOS apps behave as expected on Vision Pro. Align your app with the newly published compatibility checklist, then request to have your app evaluated directly on Vision Pro.

Learn more

Apple Vision Pro developer labs

Experience your visionOS, iPadOS, and iOS apps running on Vision Pro. With support from Apple, you’ll be able to test and optimize your apps for the infinite spatial canvas. Labs are available in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo.

Learn more

Apple Vision Pro developer kit

Have a great idea for a visionOS app that requires building and testing on Vision Pro? Apply for a Vision Pro developer kit. With continuous, direct access to Vision Pro, you’ll be able to quickly build, test, and refine your app so it delivers amazing spatial experiences on visionOS.

Learn more

Upcoming price and tax changes for apps, in-app purchases, and subscriptions

The App Store’s commerce and payments system was built to empower you to conveniently set up and sell your products and services on a global scale in 44 currencies across 175 storefronts. When tax regulations or foreign exchange rates change, we sometimes need to update prices on the App Store in certain regions and/or adjust your proceeds. These updates are done using publicly available exchange rate information from financial data providers to help ensure prices for apps and in‑app purchases stay equalized across all storefronts.

On July 25, pricing for apps and in‑app purchases (excluding auto‑renewable subscriptions) will be updated for the Egypt, Nigeria, Tanzania, and Türkiye storefronts. These updates also consider the following tax changes:

  • Egypt: introduction of a value‑added tax (VAT) of 14%
  • Tanzania: introduction of a VAT of 18% and a digital service tax of 2%
  • Türkiye: increase of the VAT rate from 18% to 20%

How this impacts pricing

  • If you’ve selected Egypt, Nigeria, Tanzania, or Türkiye as the base storefront for your app or in‑app purchase (excluding auto‑renewable subscriptions), the price won’t change on that storefront. Prices on other storefronts will be updated to maintain equalization with your chosen base price.
  • If the base storefront for your app or in‑app purchase (excluding auto‑renewable subscriptions) isn’t Egypt, Nigeria, Tanzania, or Türkiye, prices will increase on the Egypt, Nigeria, Tanzania, and Türkiye storefronts.
  • If your in‑app purchase is an auto‑renewable subscription or if you manually manage prices on storefronts instead of using the automated equalized prices, your prices won’t change.

The Pricing and Availability section of My Apps has been updated in App Store Connect to display these upcoming price changes. As always, you can change the prices of your apps, in‑app purchases, and auto‑renewable subscriptions at any time.

How this impacts proceeds and tax administration

Your proceeds for sales of apps and in-app purchases (including auto‑renewable subscriptions) will change to reflect the new tax rates and updated prices. Exhibit B of the Paid Applications Agreement has been updated to indicate that Apple collects and remits applicable taxes in Egypt and Tanzania.

Learn more about managing your prices

Viewing new pricing

Selecting a base country or region

Pricing and availability start times by region

Setting in‑app purchase pricing

Providing safe app experiences for families

The App Store was created to be a safe and trusted place for users to get apps, and a great business opportunity for developers. Apple platforms and the apps you build have become important to many families, as children use our products and services to explore the digital world and communicate with family and friends. We hold apps for kids and those with user-generated content and interactions to the highest standards. To continue delivering safe experiences for families together, we wanted to remind you about the tools, resources, and requirements that are in place to help keep users safe in your app.

Made for Kids

If you have an app that’s intended for kids, we encourage you to use the Kids category, which helps families discover age-appropriate content and apps. Apps in this category are held to higher standards for protecting children’s data and offer added safeguards for purchases and permissions (e.g., Camera and Location access).

Learn more about building apps for Kids.

Parental controls

Your app’s age rating is integrated into our operating systems and works with parental control features like Screen Time. Additionally, with Ask to Buy, when kids want to buy or download a new app or in-app purchase, they send a request to the family organizer. You can also use the Managed Settings framework to ensure the content in your app respects any content restrictions set by a parent. The Screen Time API is a powerful tool for parental control and productivity apps that helps parents manage how children use their devices. Learn more about the tools we provide to help parents know, and feel good about, what their kids are doing on their devices.
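
A minimal sketch of the Managed Settings flow, assuming the app has the Family Controls entitlement (the function name is illustrative): after authorization, restrictions written to a ManagedSettingsStore are enforced by the system.

```swift
import FamilyControls
import ManagedSettings

// Request Family Controls authorization for a child account, then
// apply a content restriction via a ManagedSettingsStore.
@MainActor
func applyChildRestrictions() async {
    do {
        try await AuthorizationCenter.shared.requestAuthorization(for: .child)
    } catch {
        return // authorization denied or unavailable on this device
    }
    let store = ManagedSettingsStore()
    store.media.denyExplicitContent = true // block explicit media content
}
```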

Sensitive and inappropriate content

Apps with user-generated content and interactions must include a set of safeguards to protect users, including a method for filtering objectionable material from being posted to the app, a mechanism to report offensive content and support timely responses to concerns, and the ability to block abusive users. Apps containing ads must include a way for users to report inappropriate and age-inappropriate ads.

iOS 17, iPadOS 17, macOS Sonoma, and watchOS 10 introduce the ability to detect and alert users to nudity in images and videos before displaying them onscreen. The Sensitive Content Analysis framework uses on-device technology to detect sensitive content in your app. Tailor your app experience to handle detected sensitive content appropriately for users who have Communication Safety or Sensitive Content Warning enabled.
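
A sketch of one way to use the framework (the function name and blur policy are illustrative): check the analysis policy first, since the analyzer only performs detection when the user has enabled one of these features.

```swift
import SensitiveContentAnalysis

// Decide whether to blur an incoming image before rendering it.
// analysisPolicy reflects the user's Communication Safety or
// Sensitive Content Warning setting; .disabled means no detection runs.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    guard analyzer.analysisPolicy != .disabled else { return false }
    let analysis = try? await analyzer.analyzeImage(at: url)
    return analysis?.isSensitive ?? false
}
```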

Supporting users

Users have multiple ways to report issues with an app, like Report a Problem. Users can also communicate app feedback to other users and developers by writing reviews of their own; users can Report a Concern with other individual user reviews. You should closely monitor your user reviews to improve the safety of your app, and have the ability to address concerns directly. Additionally, if you believe another app presents a trust or safety concern, or is in violation of our guidelines, you can share details with Apple to investigate.

These user review tools are critical to informing the work we do to keep the App Store safe. Apple deploys a combination of machine learning, automation, and human review to monitor concerns related to abuse submitted via user reviews and Report a Problem. We monitor for topics of concern such as reports of fraud and scams, copycat violations, inappropriate content and advertising, privacy and safety concerns, objectionable content, and child exploitation, and we use techniques such as semi-supervised Correlation Explanation (CorEx) models and Bidirectional Encoder Representations from Transformers (BERT)-based large language models specifically trained to recognize these topics. Flagged topics are then surfaced to our App Review team, who investigate the app further and take action if violations of our guidelines are found.

We believe we have a shared mission with you as developers to create a safe and trusted experience for families, and look forward to continuing that important work. Here are some resources that you may find helpful:

Sensitive Content Analysis framework

Learn about Ratings, Reviews, and Responses

Report a Trust & Safety concern related to another app

Learn about the ScreenTime Framework

Learn about building apps for Kids

visionOS SDK now available

You can now start creating cutting-edge spatial computing apps for the infinite canvas of Apple Vision Pro. Download Xcode 15 beta 2, which includes the visionOS SDK and Reality Composer Pro (a new tool that makes it easy to preview and prepare 3D content for visionOS). Add a visionOS target to your existing project or build an entirely new app, then iterate on your app in Xcode Previews. You can interact with your app in the all-new visionOS simulator, explore various room layouts and lighting conditions, and create tests and visualizations. New documentation and sample code are also available to help you through the development process.

Download Xcode 15 beta 2

Learn about developing for visionOS

Spotlight on: Developer tools for visionOS

With the visionOS SDK, developers worldwide can begin designing, building, and testing apps for Apple Vision Pro.

For Ryan McLeod, creator of iOS puzzle game Blackbox, the SDK brought both excitement and a little nervousness. “I didn’t expect I’d ever make apps for a platform like this — I’d never even worked in 3D!” he says. “But once you open Xcode you’re like: Right. This is just Xcode. There are a lot of new things to learn, of course, but the stuff I came in knowing, the frameworks — there’s very little change. A few tweaks and all that stuff just works.”

visionOS is designed to help you create spatial computing apps and offers many of the same frameworks found on other Apple platforms, including SwiftUI, UIKit, RealityKit, and ARKit. As a result, most developers with an iPadOS or iOS app can start working with the platform immediately by adding the visionOS destination to their existing project.
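
As a minimal sketch (ContentView and ImmersiveView stand in for an app's own views), a SwiftUI app targeting visionOS uses the same scene types found on other platforms, with spatial additions such as ImmersiveSpace:

```swift
import SwiftUI

// A minimal visionOS scene: the existing 2D interface appears in a
// window, and an ImmersiveSpace is available for spatial content.
@main
struct SampleVisionApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()   // the app's existing SwiftUI view hierarchy
        }

        ImmersiveSpace(id: "immersive") {
            ImmersiveView() // 3D content, e.g. via RealityView
        }
    }
}
```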

“It was great to be able to use the same familiar tools and frameworks that we have been using for the past decade developing for iOS, iPadOS, macOS, and watchOS,” says Karim Morsy, CEO and co-founder of Algoriddim. “It allowed us to get our existing iPad UI for djay running within hours.”

Even for developers brand new to Apple platforms, the onboarding experience was similarly smooth. “This was my first time using a Mac to work,” says Xavi H. Oromí, chief engineering officer at XRHealth. “At the beginning, of course, a new tool like Xcode takes time to learn. But after a few days of getting used to it, I didn’t miss anything from other tools I’d used in the past.”

In addition to support for visionOS, the Xcode 15 beta provides Xcode Previews for visionOS and a brand-new simulator, so people can start exploring their ideas immediately. “Transitioning between ideas, using the Simulator to test them, it was totally organic,” says Oromí. “It’s a great tool for prototyping.”

In the visionOS simulator, developers can preview apps and interactions on Vision Pro. This includes running existing iPad and iPhone apps as well as projects that target the visionOS SDK. To simulate eye movement while in an app, you can use your cursor to focus an element, then click to indicate a tap gesture. In addition to testing appearance and interactions, you can also explore how apps perform in different background and lighting scenarios using Simulated Scenes. “It worked out of the box,” says Zac Duff, CEO and co-founder of JigSpace. “You could trust what you were seeing in there was representative of what you would see on device.”

The SDK also includes a new development tool — Reality Composer Pro — which lets you preview and prepare 3D content for your visionOS apps and games. You can import and organize assets, add materials and particle effects, and bring them right back into Xcode thanks to tight build integration. “Being able to quickly test things in Reality Composer Pro and then get it up and running in the simulator meant that we were iterating quickly,” says Duff. “The feedback loop for developing was just really, really short.”
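
A sketch of that round trip, assuming the RealityKitContent package generated by the Xcode visionOS template and an asset named "Scene" (both names are assumptions about the project setup):

```swift
import SwiftUI
import RealityKit
import RealityKitContent // package generated by the Xcode visionOS template

// Load a scene authored in Reality Composer Pro and place it in a RealityView.
struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // "Scene" is the name of an asset in the Reality Composer Pro
            // package bundled with the project.
            if let scene = try? await Entity(named: "Scene",
                                             in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```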

McLeod had little experience with 3D modeling and shaders prior to developing for visionOS, but breaking Blackbox out of its window required thinking in a new dimension. To get started, McLeod used Reality Composer Pro to develop the almost-ethereal 3D bubbles that make up Blackbox’s main puzzle screen. “You can take a basic shape like a sphere and give it a good shader and make sure that it’s moving in a believable way,” says McLeod. “That goes incredibly far.”

The visionOS SDK also brings new Instruments like RealityKit Trace to developers to help them optimize the performance of their spatial computing apps. As a newcomer to using RealityKit in his apps, McLeod notes that he was “really timid” with the rendering system at first. “Anything that’s running every single frame, you’re thinking, ‘I can’t be checking this, and animating that, and spawning things. I’m going to have performance issues!’” he laughs. “I was pretty amazed at what the system could handle. But I definitely still have performance gains to be made.”

For developers like Caelin Jackson-King, an iOS software engineer for Splunk’s augmented reality team, the SDK also prompted great team discussions about updating their existing codebase. “It was a really good opportunity to redesign and refactor our app from the bottom up to have a much cleaner architecture that supported both iOS and visionOS,” says Jackson-King.

The JigSpace team had similar discussions as they brought more RealityKit and SwiftUI into their visionOS experience. “Once we got comfortable with the system, it was like a paradigm shift,” says Duff. “Rather than going, ‘OK, how do we do this thing?’, we could be more like, ‘What do we want to do next?’ Because we now have command of the tools.”

You can explore those tools now on developer.apple.com along with extensive technical documentation and sample code, design kits and tools for visionOS, and updates to the Human Interface Guidelines.

Download the visionOS SDK

Learn more about developing for visionOS

Prepare your apps for visionOS

Explore sessions about visionOS