Discuss Spatial Computing on Apple Platforms.

Posts under General subtopic


Vision Pro MR app issue: location reset
We use Unity 6 + visionOS 2.2 to develop an MR application (app mode: RealityKit with PolySpatial). In actual testing, we found that when I move more than 80–100 meters from the application's starting position, my current position is reset to Vector3.zero, which makes the application experience very bad. Is anyone else seeing the same problem? Is there a solution?
0 replies · 0 boosts · 221 views · Jan ’25
Is it possible to live render CMTaggedBuffer / MV-HEVC frames in visionOS?
Hey all, I'm working on a visionOS app that captures live frames from the left and right cameras of Apple Vision Pro using cameraFrame.sample(for: .left/.right). Apple provides documentation on encoding side-by-side frames into MV-HEVC spatial video using CMTaggedBuffer: Converting Side-by-Side 3D Video to MV-HEVC.

My question: is there any way to render tagged frames (e.g., CMTaggedBuffer with .stereoView(.leftEye/.rightEye)) live, directly to a surface in RealityKit or Metal, without saving them to a file? I'd like to create a true stereoscopic (spatial) live video preview, not just two images rendered side by side. Any advice or insights would be greatly appreciated!
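One possible direction, sketched below rather than verified: skip the MV-HEVC encode entirely and feed each eye's frames to a ShaderGraphMaterial through two TextureResource.DrawableQueue instances, with the material's graph using a Camera Index Switch node to sample the left or right texture per eye. The material, its two texture parameters, and the availability of each camera frame as an MTLTexture of matching size are all assumptions:

```swift
import RealityKit
import Metal

/// Streams per-eye camera frames into a stereo material at runtime.
/// Assumes a ShaderGraphMaterial authored in Reality Composer Pro whose
/// graph routes two texture parameters through a Camera Index Switch node,
/// and that each camera frame is available as an MTLTexture whose size and
/// pixel format match the queues below.
final class StereoPreviewStreamer {
    private let device = MTLCreateSystemDefaultDevice()!
    private lazy var commandQueue = device.makeCommandQueue()!
    let leftQueue: TextureResource.DrawableQueue
    let rightQueue: TextureResource.DrawableQueue

    init(width: Int, height: Int) throws {
        let descriptor = TextureResource.DrawableQueue.Descriptor(
            pixelFormat: .bgra8Unorm, width: width, height: height,
            usage: [.shaderRead, .renderTarget], mipmapsMode: .none)
        leftQueue = try TextureResource.DrawableQueue(descriptor)
        rightQueue = try TextureResource.DrawableQueue(descriptor)
    }

    /// Call once after loading the material: `left`/`right` are the
    /// TextureResources currently bound to its two texture parameters.
    func attach(left: TextureResource, right: TextureResource) {
        left.replace(withDrawables: leftQueue)
        right.replace(withDrawables: rightQueue)
    }

    /// Call per camera frame, once for each eye.
    func push(_ frameTexture: MTLTexture, to queue: TextureResource.DrawableQueue) {
        guard let drawable = try? queue.nextDrawable(),
              let commandBuffer = commandQueue.makeCommandBuffer(),
              let blit = commandBuffer.makeBlitCommandEncoder() else { return }
        blit.copy(from: frameTexture, to: drawable.texture) // formats/sizes must match
        blit.endEncoding()
        commandBuffer.commit()
        drawable.present() // RealityKit samples the new frame on the next render
    }
}
```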
2 replies · 0 boosts · 245 views · Aug ’25
USDZ Security
I am working on an app that will allow a user to load and share their model files (usdz, usda, usdc). I'm looking at security options to guard against bad actors. Are there security or validation methods built into ARKit/RealityKit/CloudKit when loading models or saving them in the cloud? I want to ensure no one can inject any sort of exploit through these file types.
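As far as I can tell, ARKit/RealityKit don't advertise a content-security scan for USD files; loading simply throws on malformed input. A minimal client-side pre-validation sketch, with illustrative limits (UTType checks plus a size cap, letting RealityKit's parser be the last gate):

```swift
import Foundation
import UniformTypeIdentifiers
import RealityKit

/// Cheap sanity checks before parsing. The 50 MB cap and the type
/// allowlist are illustrative choices, not Apple-recommended values.
enum ModelValidationError: Error {
    case unsupportedType, tooLarge
}

func validateAndLoadModel(at url: URL, maxBytes: Int = 50_000_000) async throws -> Entity {
    // 1. Allowlist the USD family by declared content type, not just extension.
    let allowed = [UTType.usdz,
                   UTType(filenameExtension: "usda"),
                   UTType(filenameExtension: "usdc")].compactMap { $0 }
    let values = try url.resourceValues(forKeys: [.contentTypeKey, .fileSizeKey])
    guard let type = values.contentType,
          allowed.contains(where: { type.conforms(to: $0) }) else {
        throw ModelValidationError.unsupportedType
    }
    // 2. Reject oversized uploads before parsing anything.
    guard (values.fileSize ?? .max) <= maxBytes else {
        throw ModelValidationError.tooLarge
    }
    // 3. RealityKit's own parser is the final gate; a malformed file throws here.
    return try await Entity(contentsOf: url)
}
```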
0 replies · 0 boosts · 472 views · Jul ’25
Hidden window/volume system overlays in Full Space
When I show a window while a sky sphere is displayed, the handles to drag/close/resize the window are hidden. The colliders still work, so the handles are there; only their visuals are hidden. I already know from another project that this also happens with volumes. The handles only appear once you get closer to the window or once the sky sphere is removed. Is this a known issue, or is there a fix? .persistentSystemOverlays(.visible) does not fix it. Xcode 16.3.0 Beta, visionOS 2.4.
5 replies · 0 boosts · 330 views · Mar ’25
Material showing only half?
Hi, I'm currently implementing 180°/360° playback for immersive video in my app. I was able to implement 360° easily by applying a VideoMaterial to an inward-facing (flipped) sphere. However, I'm a bit stuck on 180°. I want to implement it by setting the VideoMaterial on a hemisphere mesh, but RealityKit doesn't yet provide a default generator such as MeshResource.generateHemisphere. So instead I want to apply the VideoMaterial with the front half visible and the back half transparent, which I thought would make my sphere look like a hemisphere, but I can't find a way to implement this. I would appreciate any advice/idea/information that might help.
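In case it helps while there's no MeshResource.generateHemisphere: one workaround is to build the hemisphere yourself with MeshDescriptor and put the VideoMaterial on it, rather than fighting with a half-transparent sphere. A sketch (the slice/stack counts are arbitrary; swap the triangle winding if the video shows on the outside instead of the inside):

```swift
import Foundation
import RealityKit

/// Builds a hemisphere: full vertical sweep, 180° horizontal sweep,
/// as used for VR180 content. Intended to be viewed from the inside.
func makeHemisphere(radius: Float = 1,
                    slices: Int = 64,
                    stacks: Int = 32) throws -> MeshResource {
    var positions: [SIMD3<Float>] = []
    var uvs: [SIMD2<Float>] = []
    var indices: [UInt32] = []

    for stack in 0...stacks {
        let phi = Float(stack) / Float(stacks) * .pi       // 0 (top) ... pi (bottom)
        for slice in 0...slices {
            let theta = Float(slice) / Float(slices) * .pi // only 180° horizontally
            positions.append(SIMD3(radius * sin(phi) * cos(theta),
                                   radius * cos(phi),
                                   radius * sin(phi) * sin(theta)))
            uvs.append(SIMD2(Float(slice) / Float(slices),
                             1 - Float(stack) / Float(stacks)))
        }
    }
    let columns = UInt32(slices + 1)
    for stack in 0..<stacks {
        for slice in 0..<slices {
            let a = UInt32(stack) * columns + UInt32(slice)
            let b = a + columns
            // Wound so the faces point inward; flip the order if needed.
            indices += [a, b, a + 1, b, b + 1, a + 1]
        }
    }

    var descriptor = MeshDescriptor(name: "hemisphere")
    descriptor.positions = MeshBuffers.Positions(positions)
    descriptor.textureCoordinates = MeshBuffers.TextureCoordinates(uvs)
    descriptor.primitives = .triangles(indices)
    return try MeshResource.generate(from: [descriptor])
}
```

A ModelEntity(mesh: try makeHemisphere(), materials: [VideoMaterial(avPlayer: player)]) should then show the video only on the inner front half.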
0 replies · 0 boosts · 71 views · Oct ’25
Share Extensions embedded in visionOS apps
I'm trying to add a feature to my app that allows a user to import items from other apps, like Safari, via the share sheet. I've done this many times on iOS/iPadOS easily with a Share Extension. From what I can tell, Xcode tells me share extensions are not available on visionOS, though my experience on device tells me differently (Reminders, Notes, and more seem to implement them somehow). I was finally able to get it working on device only, but I can now no longer test in the simulator and have not found a way to distribute the app.

When attempting to run in the simulator, I get this issue:

Please try again later. Appex bundle at /Users/jason/Library/Developer/CoreSimulator/Devices/09A70160-4F4F-4F5E-B679-F6F7D876D7EF/data/Library/Caches/com.apple.mobile.installd.staging/temp.6OAEZp/extracted/LaunchBar.app/PlugIns/LaunchBarShareExtension.appex with id co.swiftfox.LaunchBar.ShareExtension specifies a value (com.apple.share-services) for the NSExtensionPointIdentifier key in the NSExtension dictionary in its Info.plist that does not correspond to a known extension point.

When trying to archive and upload to TestFlight, I get this similar error:

Invalid Info.plist value. The value for the key 'DTPlatformName' in bundle LaunchBar.app/PlugIns/LaunchBarShareExtension.appex is invalid. (ID: 207610c7-b7e1-48be-959b-22a43cd32d16)

The app is for visionOS only, which I'm thinking might be the problem. The share extension is "Designed for iPhone" and requires me to include iPhone as a run destination. In the worst case I can build an iPhone UI for the app, but I'd rather not, as it is very specific to visionOS. Has anyone successfully launched a share extension in a visionOS-only app? I have an iPad app with a share extension that shows up fine on visionOS; the issue seems to be specific to visionOS-only apps.
1 reply · 0 boosts · 102 views · Oct ’25
visionOS 2.0 – Persist room‑fixed RealityKit entities on walls across app launches (WorldAnchor?)
I want to let users place 2D/3D "artworks" on detected walls and have them reappear in exactly the same real-world spot after quitting and relaunching the app (like widgets do, but for my own entities).

Environment: Xcode 26, visionOS 2.0, RealityKit + ARKitSession/WorldTrackingProvider. Entities are parented to a holder that's aligned to a wall via plane/mesh raycasts.

What I've tried:
- Create a WorldAnchor at placement; save its UUID + full 4×4 transform.
- On the next launch, re-create the WorldAnchor (or set the saved transform) and attach the entity.
- Gate the restore on relocalization/mesh updates, and disable all raycast/search after restore.

Issue: after relaunch, placement still resolves relative to the current device pose, not to the same wall position.

Questions:
- Is there a public API in visionOS 2.0 to persist app-managed world anchors across sessions (room-fixed), e.g., an AnchorStore or equivalent?
- If not, what's the recommended pattern to reliably restore wall-anchored content?
- Are the persistence features mentioned for widgets/windows available to third-party RealityKit entities?
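For what it's worth, my understanding is that visionOS persists WorldAnchors added through WorldTrackingProvider across app launches and re-delivers them through anchorUpdates once it relocalizes to the room, so the pattern is to key your own content by anchor UUID and wait, rather than re-creating anchors from saved transforms (which pins them to the current device pose). A sketch under that assumption; the entity bookkeeping and makeArtwork(for:) are illustrative stand-ins:

```swift
import ARKit
import RealityKit

@MainActor
final class WallArtStore {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()
    private var entities: [UUID: Entity] = [:]

    /// Place new artwork: create the anchor once and let the system persist it.
    /// Save anchor.id alongside your artwork data in your own store.
    func place(at originFromAnchor: simd_float4x4) async throws {
        let anchor = WorldAnchor(originFromAnchorTransform: originFromAnchor)
        try await worldTracking.addAnchor(anchor)
    }

    /// On launch, don't rebuild anchors from saved transforms. Run the
    /// provider and wait: persisted anchors arrive as .added events once
    /// the system relocalizes to the room.
    func restore(into root: Entity) async throws {
        try await session.run([worldTracking])
        for await update in worldTracking.anchorUpdates {
            let anchor = update.anchor
            switch update.event {
            case .added, .updated:
                let entity = entities[anchor.id] ?? makeArtwork(for: anchor.id)
                entities[anchor.id] = entity
                if entity.parent == nil { root.addChild(entity) }
                entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
            case .removed:
                entities[anchor.id]?.removeFromParent()
                entities[anchor.id] = nil
            @unknown default:
                break
            }
        }
    }

    private func makeArtwork(for id: UUID) -> Entity {
        Entity() // hypothetical: rebuild the artwork from data saved under this id
    }
}
```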
0 replies · 0 boosts · 176 views · Oct ’25
I am developing an immersive video app for visionOS, but I have an issue with the app and video player windows
In a visionOS app, we have two types of windows:

- Main app window: the default window that launches when the app starts. It displays the video listings and other primary content.
- Immersive space: opened only when a user starts streaming or playing a video.

Issue: when entering the immersive space, the main app window remains visible in front of it unless manually closed. To avoid this, I currently close the main window when transitioning to the immersive space and reopen it when exiting. However, this causes the app to restart instead of resuming from its previous state.

Desired behavior: I want the main app window to retain its state and seamlessly resume from where it was before entering immersive mode, rather than restarting.

Attempts and challenges: I tried managing opacity, visibility, and state preservation, but none worked as expected, and I couldn't find a way to push the main window to the background while bringing the immersive space to the foreground. I'm looking for a way to keep the main window's state intact while transitioning between immersive and normal modes.
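One workaround, sketched below under stated assumptions: keep the window's state in a model owned by the App itself rather than by the window's views, so dismissing and reopening the window does not reset it. All names here (LibraryModel, the window and space ids, the placeholder buttons) are illustrative, not from the original post:

```swift
import SwiftUI

@Observable
final class LibraryModel {
    var selectedVideoID: String? // whatever state should survive the round trip
}

@main
struct VideoApp: App {
    @State private var library = LibraryModel() // outlives the main window

    var body: some Scene {
        WindowGroup(id: "main") {
            LibraryView().environment(library)
        }
        ImmersiveSpace(id: "player") {
            PlayerView().environment(library)
        }
    }
}

struct LibraryView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissWindow) private var dismissWindow
    @Environment(LibraryModel.self) private var library

    var body: some View {
        Button("Play") {
            library.selectedVideoID = "intro" // state lives in the model
            Task {
                _ = await openImmersiveSpace(id: "player")
                dismissWindow(id: "main") // window goes away; model stays
            }
        }
    }
}

struct PlayerView: View {
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace
    @Environment(\.openWindow) private var openWindow
    @Environment(LibraryModel.self) private var library

    var body: some View {
        Button("Exit \(library.selectedVideoID ?? "")") {
            Task {
                openWindow(id: "main") // reopened window reads the model
                await dismissImmersiveSpace()
            }
        }
    }
}
```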
1 reply · 0 boosts · 118 views · Mar ’25
Safari-like toolbar in visionOS
I like the toolbar visionOS's Safari uses for back/forward navigation, share, etc. It floats in front of the window. My attempt to do this with ornaments isn't as satisfying, as they partially cover the window, and my attempts with toolbar haven't produced visible results. Is this Safari-style toolbar exposed in Apple's visionOS APIs? If so, could someone point me to documentation or sample code? Thanks!
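For reference, I don't believe the Safari toolbar itself is exposed as API, but an ornament whose content is aligned fully outside the window edge comes close. A sketch (the anchor, alignment, and styling are guesses at the look, not a public "Safari toolbar" component):

```swift
import SwiftUI

/// A floating, Safari-like control strip built with an ornament.
/// Anchoring the ornament content's top edge to the scene's bottom keeps
/// it fully outside the window instead of half-covering it.
struct BrowserWindow: View {
    var body: some View {
        Text("Content")
            .ornament(attachmentAnchor: .scene(.bottom),
                      contentAlignment: .top) {
                HStack(spacing: 16) {
                    Button(action: {}) { Image(systemName: "chevron.backward") }
                    Button(action: {}) { Image(systemName: "chevron.forward") }
                    Button(action: {}) { Image(systemName: "square.and.arrow.up") }
                }
                .padding()
                .glassBackgroundEffect() // the floating glass look
            }
    }
}
```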
1 reply · 0 boosts · 191 views · Oct ’25
Getting the world position of a QR code
Hi, I would love your help with this. I'm trying to get the positions in space of two QR codes so I can align content to them. Detection shows the QR codes' position is always (0, 0, 0) and I don't understand why. Here's my code:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct AnchorView: View {
    @ObservedObject var qrCoordinator: QRCoordinator
    @ObservedObject var coordinator: ImmersiveCoordinator
    let qrName: String
    @Binding var startQRDetection: Bool
    @State private var anchor: AnchorEntity? = nil
    @State private var detectionTask: Task<Void, Never>? = nil

    var body: some View {
        RealityView { content in
            // Add the QR anchor once (must exist before detection starts)
            if anchor == nil {
                let imageAnchor = AnchorEntity(.image(group: "QRs", name: qrName))
                content.add(imageAnchor)
                anchor = imageAnchor
                print("📌 Created anchor for \(qrName)")
            }
        }
        .onChange(of: startQRDetection) { enabled in
            if enabled {
                startDetection()
            } else {
                stopDetection()
            }
        }
        .onDisappear {
            stopDetection()
        }
    }

    private func startDetection() {
        guard detectionTask == nil, let anchor = anchor else { return }
        detectionTask = Task {
            var detected = false
            while !Task.isCancelled && !detected {
                print("🔎 Checking \(qrName)... isAnchored=\(anchor.isAnchored)")
                if anchor.isAnchored {
                    // Wait a short moment to let the transform update
                    try? await Task.sleep(nanoseconds: 100_000_000)
                    let worldPos = anchor.position(relativeTo: nil)
                    if worldPos != .zero {
                        // Convert to a position relative to modelRootEntity if available
                        var posToSave = worldPos
                        if let modelEntity = coordinator.modelRootEntity {
                            posToSave = anchor.position(relativeTo: modelEntity)
                            print("converted to model position")
                        } else {
                            print("⚠️ modelRootEntity not available, using world position")
                        }
                        print("✅ \(qrName) detected at position: world=\(worldPos) saved=\(posToSave)")
                        if qrName == "reanchor1" {
                            qrCoordinator.qr1Position = posToSave
                            let marker = createMarker(color: [0, 1, 0])
                            marker.position = SIMD3<Float>(0, 0.02, 0) // sits just above the QR
                            anchor.addChild(marker)
                            print("marker1 added")
                        } else if qrName == "reanchor2" {
                            qrCoordinator.qr2Position = posToSave
                            let marker = createMarker(color: [0, 0, 1])
                            marker.position = SIMD3<Float>(0, 0.02, 0) // sits just above the QR
                            anchor.addChild(marker)
                            print("marker2 added")
                        }
                        detected = true
                    } else {
                        print("⚠️ \(qrName) anchored but still at origin, retrying...")
                    }
                }
                try? await Task.sleep(nanoseconds: 500_000_000) // throttle the loop
            }
            print("🛑 QR detection loop ended for \(qrName)")
            detectionTask = nil
        }
    }

    private func stopDetection() {
        detectionTask?.cancel()
        detectionTask = nil
    }

    private func createMarker(color: SIMD3<Float>) -> ModelEntity {
        let sphere = MeshResource.generateSphere(radius: 0.05)
        let material = SimpleMaterial(
            color: UIColor(
                red: CGFloat(color.x),
                green: CGFloat(color.y),
                blue: CGFloat(color.z),
                alpha: 1.0
            ),
            isMetallic: false
        )
        let marker = ModelEntity(mesh: sphere, materials: [material])
        marker.name = "marker"
        return marker
    }
}
```
2 replies · 0 boosts · 479 views · Oct ’25
visionOS SharePlay nearby codes expire
It looks like the pairing expires one week after accepting another AVP device as nearby. Since we provide our clients with a permanent app for walking through architecture, it's a shame that non-technical staff have to reconnect five devices every week to work together. Is there any workaround for this issue, or does it go straight to the wishlist? Thanks for the support!
0 replies · 0 boosts · 53 views · Sep ’25
Vision Pro camera frame rate
Hi, I'm working with CameraFrameProvider from the Enterprise API. Is it always capped at 30 fps, or is there something I can switch to get more? Assuming it is capped at 30, let me cram in an additional question: if I get a developer strap and attach an external camera capable of more than 30 fps, will I get the full stream, or will some other limitation kick in?
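One way to check the cap empirically, assuming CameraVideoFormat exposes frameRate and frameSize as I recall: enumerate the supported formats and pick the fastest, then pass the result to cameraFrameProvider.cameraFrameUpdates(for:).

```swift
import ARKit

/// A sketch: list what the device actually offers instead of assuming
/// 30 fps is the only option. Assumes the Enterprise API's
/// CameraVideoFormat exposes frameRate and frameSize properties.
func highestFrameRateFormat() -> CameraVideoFormat? {
    let formats = CameraVideoFormat.supportedVideoFormats(for: .main,
                                                          cameraPositions: [.left])
    for format in formats {
        print("\(Int(format.frameSize.width))x\(Int(format.frameSize.height))",
              "@ \(format.frameRate) fps")
    }
    return formats.max { $0.frameRate < $1.frameRate }
}
```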
2 replies · 0 boosts · 113 views · Apr ’25
How to create a MultiplayerDelegate and use TabletopNetworkSession features?
When building a multiplayer Tabletop game, the documentation shows how to attach a custom TabletopNetworkSessionCoordinator, which can be used in addition to TabletopGame.MultiplayerDelegate. But so far we have been unable to create these custom coordinators or get a delegate that works. Our current setup with our generic GroupActivity works by passing the session to TabletopGame's coordinateWithSession method (as in the current sample project), but we haven't found a way to access and control, for example, the arbiter, seats, and player events, among other features mentioned at https://developer.apple.com/documentation/tabletopkit/tabletopnetworksession. Is it correct to expect access to the participants, messenger, or journal without having to maintain a parallel coordinator? Possibly we are missing something here; any suggestions?
2 replies · 0 boosts · 235 views · Apr ’25
Cursor display issue on attachment view in immersive space
While using Screen Mirroring in Developer Mode within my immersive space, I noticed an alignment issue with the computer cursor (the transparent circle). When I move it toward an attachment view, the cursor remains horizontal instead of aligning with the surface of the attachment view. It displays correctly on a 2D window; it is only wrong on attachment views. Is this behavior a bug, or could it be caused by a missing or incorrect configuration on the attachment view? Thanks for any help.
1 reply · 0 boosts · 90 views · Apr ’25