Is there any way to extend the video recording time in Reality Composer Pro from 3:00 to a longer value, such as by editing preferences in Terminal or through some other workaround?
Is there any way to use the strap and a USB-C cable as a live video stream input source that would mirror to QuickTime or some other video capture tool?
I am assuming there is no online documentation or user manual for the strap, but please correct me if I'm wrong. Thank you.
Reality Composer Pro
Prototype and produce content for AR experiences using Reality Composer Pro.
Hi there,
I'm developing a visionOS app that uses the anchor points and mesh from SceneReconstructionProvider anchor updates. I load an ImmersiveSpace using a RealityView, apply a ShaderGraphMaterial (from a Shader Graph in Reality Composer Pro) to the mesh, and call setParameter to dynamically update the material at a very rapid frequency. The mesh is locked (no further updates) before the calls to setParameter.
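Roughly, each update does the following (a minimal sketch; the parameter name and value are placeholders, not the actual project values):

import RealityKit

// Called on every update tick, after the reconstructed mesh has been locked.
func updateMeshMaterial(on entity: ModelEntity, progress: Float) throws {
    // Grab the ShaderGraphMaterial currently applied to the mesh.
    guard var material = entity.model?.materials.first as? ShaderGraphMaterial else { return }
    // Push the new value into the Shader Graph parameter.
    try material.setParameter(name: "Progress", value: .float(progress))
    // Reassign the mutated material so RealityKit picks up the change.
    entity.model?.materials = [material]
}

This process works for a few minutes, but eventually I get the following error in the console: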
assertion failure: Index out of range (operator[]:line 789) index = 13662, max = 1
With the following stack trace:
Thread 1 Queue : com.apple.main-thread (serial)
#0 0x00000002880f90d0 in __abort_with_payload ()
#1 0x000000028812a6dc in abort_with_payload_wrapper_internal ()
#2 0x000000028812a710 in abort_with_payload ()
#3 0x0000000288003f40 in _os_crash_msg ()
#4 0x00000001dc9ff624 in re::ecs2::ComponentBucketsBase::addComponent ()
#5 0x00000001dc9ffadc in re::ecs2::ComponentBucketsBase::moveComponent ()
#6 0x00000001dc8b0278 in re::ecs2::MaterialParameterBlockArrayComponentStateImpl::processPreparingComponents ()
#7 0x00000001dc8b05e4 in re::ecs2::MaterialParameterBlockArraySystem::update ()
#8 0x00000001dd008744 in re::Scheduler::executePhase ()
#9 0x00000001dc032ec4 in re::Engine::executePhase ()
#10 0x0000000248121898 in RCPSharedSimulationExecuteUpdate ()
#11 0x00000002264e488c in __59-[MRUISharedSimulation _doJoinWithConnectionContext:error:]_block_invoke.44 ()
#12 0x0000000268c5fe9c in _UIUpdateSequenceRunNext ()
#13 0x00000002696ea540 in schedulerStepScheduledMainSectionContinue ()
#14 0x000000026af8d284 in UC::DriverCore::continueProcessing ()
#15 0x00000001a1bd4e6c in CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION ()
#16 0x00000001a1bd4db0 in __CFRunLoopDoSource0 ()
#17 0x00000001a1bd44f0 in __CFRunLoopDoSources0 ()
#18 0x00000001a1bd3640 in __CFRunLoopRun ()
#19 0x00000001a1bce284 in _CFRunLoopRunSpecificWithOptions ()
#20 0x00000001eff12d2c in GSEventRunModal ()
#21 0x00000002697de878 in -[UIApplication _run] ()
#22 0x00000002697e33c0 in UIApplicationMain ()
#23 0x00000001b56651e4 in closure #1 (Swift.UnsafeMutablePointer<Swift.Optional<Swift.UnsafeMutablePointer<Swift.Int8>>>) -> Swift.Never in SwiftUI.KitRendererCommon(Swift.AnyObject.Type) -> Swift.Never ()
#24 0x00000001b5664f08 in SwiftUI.runApp<τ_0_0 where τ_0_0: SwiftUI.App>(τ_0_0) -> Swift.Never ()
#25 0x00000001b53ad570 in static SwiftUI.App.main() -> () ()
#26 0x0000000101bc7b9c in static MetalRendererApp.$main() ()
#27 0x0000000101bc7bdc in main ()
#28 0x0000000197fd0284 in start ()
Any advice on how to solve this or prevent the error?
Thanks!
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
Reality Composer Pro
Shader Graph Editor
That title would have made a great WWDC session. Unfortunately, it seems like nothing is new in Reality Composer Pro this year. I've noticed that all versions of the Xcode beta this summer have shipped with Reality Composer Pro version 2.0. There have been slight bumps in the build number, but I haven't found any new features or seen any documentation to indicate that anything has changed.
So the question is, what is the state of Reality Composer Pro? Should we continue to use this tool or start doing everything in code? A huge number of Sample Projects use Reality Composer Pro, so it seems like Apple is still using it even if they didn't update it this year.
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
I developed an app for Vision Pro and created a button that allows users to exit the app instead of being forced to quit it. I use exit(0) to exit the application, but when I re-enter, the loaded window is not the initial window. Is there any relevant code that can be used? Thank you.
Is there any way to convert a TextureResource to an Image?
Has the memory limit for each app on Vision Pro been extended? Has the memory limit been relaxed?
After writing the code, when debugging on Vision Pro, the program blocks when running from Xcode to Vision Pro, and it takes a long time for the execution information to appear in the Xcode console.
I have a question about Apple’s preinstalled visionOS app “Encounter Dinosaurs.”
In this app, the dinosaurs are displayed over the real-world background, but the PhysicallyBasedMaterial (PBM) in RealityKit doesn’t appear to respond to the actual brightness of the environment.
Even when I change the lighting in the room, the dinosaurs’ brightness and shading remain almost the same.
If this behavior is intentional — for example, if the app disables real-world lighting influence or uses a fixed lighting setup — could someone explain how and why it’s implemented that way?
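To clarify what I mean by a fixed lighting setup, here is a rough sketch of how an app could opt an entity out of real-world lighting by attaching its own image-based light (the resource name is a placeholder, and I have no idea whether this is what Encounter Dinosaurs actually does):

import RealityKit

// Give the entity a fixed image-based light so its shading no longer follows the room (assumed approach).
func applyFixedLighting(to dinosaur: Entity) async throws {
    let environment = try await EnvironmentResource(named: "StudioLighting")  // placeholder asset name
    dinosaur.components.set(ImageBasedLightComponent(source: .single(environment)))
    dinosaur.components.set(ImageBasedLightReceiverComponent(imageBasedLight: dinosaur))
}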
I’m currently developing a visionOS app that includes an RCP scene with a large USDZ file (around 2GB).
Each time I make adjustments to the CG model in Blender, I export it as USDZ again, place it in the RCP scene, and then build the app using Xcode.
However, because the USDZ file is quite large, the build process takes a long time, significantly slowing down my development speed.
For example, I’d like to know if there are any effective ways to:
Improve overall build performance
Reduce the time between updating the USDZ file and completing the build
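One idea I've been considering, but haven't validated, is to load the USDZ at runtime from a file URL instead of baking it into the RCP scene, so that swapping the model doesn't force a full rebuild. Roughly (the path is a placeholder):

import RealityKit

// Load the large USDZ at runtime instead of embedding it in the Reality Composer Pro scene.
func loadLargeModel() async throws -> Entity {
    let url = URL(fileURLWithPath: "/path/to/Model.usdz")  // placeholder path
    return try await Entity(contentsOf: url)
}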
Any advice or best practices for optimizing this workflow would be greatly appreciated.
Best regards,
Sadao
Hello!
Back from last week's amazing visit to Cupertino for the Game Dev session and diving back into Vision Pro experimentation.
I've exported a simple geometry-nodes animation test from Blender for use in RCP, with intended output to Vision Pro. I've attached a few screenshots showing the node setup and how it animates over time.
I select the Cube mesh and export it as .usdc with animation. In the Finder via Quick Look, I can actually see it working! If I try exporting as .usdz, however, I'm not seeing any animation in the Finder preview.
Next, I import the .usdc file to RCP and add an Animation Library component to the cube mesh, but am not seeing any animation selectable, even though I see animation playing back in preview.
Next, I import the .usdc into Maya (via a proper USD Stage pipeline; I'm learning to be USD compliant for authoring!) to verify that the animation works, and it does.
What step(s) am I missing to get this working in Reality Composer Pro? My goal is to experiment with animating these geometry node instances - along with color animation if possible - over to Vision Pro for full scale, immersive presentation.
Of particular note, I am not a programmer, so I am trying my best to brute-force this the only way I currently know how: by keyframe animation and importing through Reality Composer Pro. I realize that, ideally, I should be learning how to leverage the code side so I can start programmatically controlling my 3D entities (with animation), but I need more hand-holding and real-world examples to help me get there. Thx!
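For anyone willing to nudge me toward the code route: my (very possibly wrong) understanding is that once the animation survives the USD import, playing it back from code would look roughly like this, with the entity name being a placeholder:

import RealityKit
import RealityKitContent

// Load the cube from the RealityKitContent bundle and loop its first baked animation.
let cube = try await Entity(named: "GeometryNodesCube", in: realityKitContentBundle)
if let animation = cube.availableAnimations.first {
    cube.playAnimation(animation.repeat(), transitionDuration: 0.3, startsPaused: false)
}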
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
I am creating an augmented reality iOS app (not visionOS) using scenes created in Reality Composer Pro.
I'd like my code to send a notification to an RCP scene that plays a timeline. The RCP interface has the option to set up a behavior for this purpose.
This Forum thread https://developer.apple.com/forums/thread/756978 suggests the code I need for sending a notification is:
NotificationCenter.default.post(
    name: NSNotification.Name("RealityKit.NotificationTrigger"),
    object: nil,
    userInfo: [
        "RealityKit.NotificationTrigger.Scene": scene,
        "RealityKit.NotificationTrigger.Identifier": "HideCharacter"
    ]
)
but the 'scene' var needs to point to the relevant RCP scene, which is loaded within a UIViewRepresentable ARView (because even in iOS 26 it seems RealityKit/RealityView isn't quite ready for AR use), and I can't work out how to correctly access it. The examples in the link above are for working with RealityKit and visionOS only.
Code for loading the scene is as follows. How can I get the notification code above to be situated in a separate SwiftUI View and send the notification to the RCP scene?
import SwiftUI
import ARKit
import RealityKit
import RealityKitContent

struct ARContainerView: UIViewRepresentable {  // declaration restored for completeness; the struct name is assumed
    typealias UIViewType = ARView

    func makeUIView(context: Context) -> ARView {
        // Create an ARView
        let arView = ARView(frame: .zero)
        // Configure it for horizontal plane detection
        let arConfiguration = ARWorldTrackingConfiguration()
        arConfiguration.planeDetection = [.horizontal]
        arView.session.run(arConfiguration)
        // Load the Reality Composer Pro scene
        let scene = try! Entity.load(named: "myScene", in: realityKitContentBundle)
        // Create a horizontal plane anchor
        let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))
        // Append the scene to the anchor
        anchor.children.append(scene)
        // Append the anchor to the ARView
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
    }
}
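The shape I'm imagining (just a sketch with hypothetical names; I haven't confirmed this is the right approach) is to stash the loaded entity in a shared observable object from makeUIView, so a separate SwiftUI view can post the notification against it:

import SwiftUI
import RealityKit

// Hypothetical shared object so another SwiftUI view can reach the loaded RCP scene.
final class RCPSceneHolder: ObservableObject {
    @Published var rcpScene: Entity?
}

// Inside makeUIView, right after Entity.load succeeds, I would store the entity:
//     sceneHolder.rcpScene = scene

// Then, from any other view that has access to the holder:
func sendHideCharacter(_ holder: RCPSceneHolder) {
    guard let rcpScene = holder.rcpScene else { return }
    NotificationCenter.default.post(
        name: NSNotification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            // If I'm reading the linked thread correctly, this should be the RealityKit
            // scene containing the entity rather than the entity itself.
            "RealityKit.NotificationTrigger.Scene": rcpScene.scene as Any,
            "RealityKit.NotificationTrigger.Identifier": "HideCharacter"
        ]
    )
}

Is that roughly the right idea, or is there a cleaner way to hand the scene reference out of the UIViewRepresentable?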
The timeline editor often feels unfinished. Setting the time cursor to a different time is often not reflected in the preview.
You either have to click a clip or wait.
Sometimes the cursor even disappears, e.g. when switching tabs to the Shader Graph.
The ability to select and move multiple clips is missing.
There is also no snapping to clips or to the time cursor as found in other tools.
And then there is the timeline compile bug
https://developer.apple.com/forums/thread/810868
The timeline is a good start as it is, but it definitely needs some more love to be on par with other commercial tools like Unity or After Effects.
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Is there a way to stop a Reality Composer Pro timeline and restart it later? For me it looks like you can only start a timeline via a notification.
Related to that, I have the issue that if the notification to start a certain timeline fires twice or more, the timeline appears to play multiple times and the animations start to glitch and jump. What is best practice here to avoid this?
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
A ShaderGraphMaterial with an Occlusion Surface Output generated with Reality Composer Pro 2 fails to load on iOS 18 and macOS 15 with the following error:
RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound (https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound)
This happens with both https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit)
RealityView { content in
    do {
        let bgEntity = ModelEntity(mesh: .generateCone(height: 0.5, radius: 0.1), materials: [SimpleMaterial(color: .red, isMetallic: true)])
        bgEntity.position.z = -0.2
        content.add(bgEntity)
        let occlusionMaterial = try await ShaderGraphMaterial(named: "/Root/OcclusionMaterial", from: "OcclusionMaterial")
        let testEntity = ModelEntity(mesh: .generateSphere(radius: 0.4), materials: [occlusionMaterial])
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)
Feedback ID: FB15081296
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
Tags:
RealityKit
Reality Composer Pro
Shader Graph Editor
What is the recommended best practice for importing a Blender 3D file into RCP? I assume as a .usdz file? Is there a WWDC24 session or other Apple resource that best explains this? I want to make sure I provide the right format/file to RCP from Blender.
Hey everyone,
I'm working on an object viewer where users can place objects in a real room using AR, and I want both visionOS (Apple Vision Pro) and iOS devices (iPad, iPhone) to participate in the same shared spatial experience. The idea is that a user with a Vision Pro can place an object, and peers using iPhones/iPads can see the same object in the same position in their AR view.
I've looked into ARKit's Shared ARWorldMap and MultipeerConnectivity, but I'm not sure if this extends seamlessly to visionOS or if Apple has an official way to sync spatial data between visionOS and iOS devices.
Has anyone tried sharing a spatial world between visionOS and iOS?
Are there any built-in frameworks that allow for a shared multiuser AR session across these devices?
If not, what would be the best way to sync object positions between them?
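To make that last question concrete, here is a minimal sketch of the kind of position sync I have in mind over MultipeerConnectivity (session setup omitted; the payload type is made up). The part I really don't know how to solve is agreeing on a shared world origin across the two platforms, since ARWorldMap doesn't seem to be available on visionOS as far as I can tell.

import Foundation
import MultipeerConnectivity
import simd

// Made-up payload describing where the object was placed, relative to some agreed origin.
struct ObjectPlacement: Codable {
    var position: SIMD3<Float>
    var rotation: SIMD4<Float>   // quaternion as (x, y, z, w)
}

// Broadcast the placement to all connected peers over an existing MCSession.
func broadcast(_ placement: ObjectPlacement, over session: MCSession) throws {
    let data = try JSONEncoder().encode(placement)
    try session.send(data, toPeers: session.connectedPeers, with: .reliable)
}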
Would love to hear if anyone has insights or experience with this! 🚀
Thanks!
We have a project which is currently being built as a XCFramework.
The framework contains a custom component to be used with entities in Reality Composer Pro.
I have tried to set the RCP Package.swift file to reference the framework package in the dependencies.
Nothing that I do with the folder path to reference the code is working.
Do I need to change the project to be using Swift source code instead of a XCFramework?
The component needs to be in the framework because there is a class in the framework that works directly with the custom component.
I am able to reference the XCFramework as a Swift Package with other projects.
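For reference, this is roughly the shape of what I've been attempting in the RCP package's Package.swift (names and paths are placeholders; none of my variations have worked):

// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "RealityKitContent",
    platforms: [.visionOS(.v1)],
    products: [
        .library(name: "RealityKitContent", targets: ["RealityKitContent"])
    ],
    targets: [
        // Placeholder path to the prebuilt framework that contains the custom component.
        .binaryTarget(name: "MyComponents", path: "../MyComponents.xcframework"),
        .target(name: "RealityKitContent", dependencies: ["MyComponents"])
    ]
)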
Topic:
Spatial Computing
SubTopic:
Reality Composer Pro
I want to make a model with added bones move by dragging it with gestures. This is a model exported from Blender.
My understanding is that I should use IKComponent, but I don't know how to use it specifically.
I sketched an idea for a project in Reality Composer on my iPad, thinking that when I had a chance to sit down I would work it up in Xcode.
However, when I got back to my computer, I discovered I cannot open a file created in Reality Composer (or the exported Reality file) in Reality Composer Pro.
Am I missing something obvious here, because this seems like a huge oversight.
If anyone can let me know how to open a file created in Reality Composer in Reality Composer Pro, I would greatly appreciate it, partly because there seem to be objects available in Reality Composer that are not in Reality Composer Pro.
Thanks
Stan
I downloaded the file through Scoot, and when I remove Vision Pro, the app calls the StreamDelegate method and returns .endEncountered. How can I solve this problem?
Thank you!