Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic


Fullscreen detection using Core Graphics
Hi, I am trying to detect whether all screens are in fullscreen mode. The current approach is to get every window's information from CGWindowListCopyWindowInfo and then compare each window's frame with the frames of NSScreen.screens. However, there is a problem: the y position of the window seems to be relative to its screen. Since it is not an absolute position, I cannot compare it with the screen's coordinates. Does anyone know of other information I can use? Or is there a way to retrieve the absolute position or the screen ID from the CGWindow dictionary?
Replies: 2 · Boosts: 0 · Views: 100 · Activity: Apr ’25
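A possible approach for the question above, as a minimal sketch: kCGWindowBounds rectangles are in global display coordinates (origin at the top-left of the primary display, y growing downward), while NSScreen frames use Cocoa coordinates (origin at the bottom-left of the primary display). Flipping one of the two makes them directly comparable. The exact-equality test and the window-list options below are illustrative assumptions, not the only way to do it.

```swift
import AppKit
import CoreGraphics

// Sketch: list on-screen windows whose flipped bounds exactly cover some screen.
func fullscreenWindowInfo() -> [[String: Any]] {
    guard let info = CGWindowListCopyWindowInfo([.optionOnScreenOnly, .excludeDesktopElements],
                                                kCGNullWindowID) as? [[String: Any]] else { return [] }
    // The primary screen (first in the list) defines the flip between the two coordinate systems.
    let primaryHeight = NSScreen.screens.first?.frame.height ?? 0

    return info.filter { entry in
        guard let boundsDict = entry[kCGWindowBounds as String] as? NSDictionary,
              let cgBounds = CGRect(dictionaryRepresentation: boundsDict as CFDictionary) else { return false }
        // Convert top-left-origin CG coordinates to bottom-left-origin Cocoa coordinates.
        let cocoaBounds = CGRect(x: cgBounds.origin.x,
                                 y: primaryHeight - cgBounds.origin.y - cgBounds.height,
                                 width: cgBounds.width,
                                 height: cgBounds.height)
        // A window counts as fullscreen here if it exactly covers some screen's frame.
        return NSScreen.screens.contains { $0.frame == cocoaBounds }
    }
}
```

Exact frame equality is strict; in practice a small tolerance is safer, and windows on other Spaces can still behave differently.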
Skybox for iOS/SwiftUI not working
I have tried every combination of suggestions to get a skybox to appear. I am using SwiftUI, RealityKit, and iOS, in a non-immersive environment. Does anyone have code that works to display a skybox? When I use a do/catch block I get "environmentResource not found". I have checked the syntax, ensured the folder is referencing the target, used the same name for the folder as the file, and the file is a .hdr (I assume this is supported). I have also moved the file folder to the top level - no change.
Replies: 2 · Boosts: 0 · Views: 441 · Activity: 1w
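For reference, a minimal sketch of one commonly used path for a non-immersive skybox on iOS: an ARView in non-AR camera mode with its environment background set to a skybox, wrapped for SwiftUI. The resource name "Skybox" and the UIViewRepresentable wrapper are assumptions, not taken from the post; the .hdr asset just needs to be a member of the app target.

```swift
import RealityKit
import SwiftUI

// Sketch: ARView in .nonAR camera mode with an image-based skybox background.
struct SkyboxContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)
        do {
            // "Skybox" is a hypothetical resource name added to the app target.
            let environment = try EnvironmentResource.load(named: "Skybox")
            arView.environment.background = .skybox(environment)
        } catch {
            // This is where "environmentResource not found" style errors surface.
            print("Skybox load failed:", error)
        }
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```

If the skybox must live inside RealityView instead, check whether your deployment target exposes a skybox environment on the RealityView content; otherwise the ARView route above is a workable fallback.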
WWDC25 Metal & game technologies group lab
Hello, thank you for attending today’s Metal & game technologies group lab at WWDC25! We were delighted to answer many questions from developers and energized by the community engagement. We hope you enjoyed it and welcome your feedback. We invite you to carry on the conversation here, particularly if your question appeared in Slido and we were unable to answer it during the lab. If your question received feedback, let us know if you need clarification. You may also want to ask your question again in a different lab, e.g. the visionOS lab tomorrow (we realize this can be confusing when frameworks interoperate). We have a lot to learn from each other, so let’s get to Q&A and make the best of WWDC25! 😃 Looking forward to your questions posted in new threads.
Replies: 2 · Boosts: 0 · Views: 378 · Activity: Jul ’25
metal-cpp syntax for MTL::Buffer float2 parameter
I'm trying to pass a buffer of float2 items from the CPU to the GPU. In the kernel, I can declare a parameter for the buffer, for example:

    device const float2* values

How do I specify float2 as the type for the MTL::Buffer? I managed to get the code to work by "cheating": defining a simple class that has the same data members as a float2, but there is probably a better way.

    class Coord_f {
    public:
        float x{0.0f};
        float y{0.0f};
    };

Then I allocate like this:

    NS::TransferPtr(device->newBuffer(n_elements * sizeof(Coord_f), MTL::ResourceStorageModeManaged))

The metal-cpp headers do not appear to define vector types like float2, but I'm doubtless missing something. Thanks.
Replies: 2 · Boosts: 0 · Views: 656 · Activity: Jan ’25
Subdivision shows in RealityComposerPro but not when loaded in Simulator
Hello, I am trying to use the subdivision mesh rendering option. I can see it working in Reality Composer Pro, but not when the asset is loaded and displayed in the Simulator. I am using this code:

    import SwiftUI
    import RealityKit
    import RealityKitContent

    struct AirspaceView: View {
        // MARK: - VIEW BODY
        var body: some View {
            RealityView { content in
                if let a = try? await Entity(named: "Models/Test/Test.usdc", in: realityKitContentBundle) {
                    content.add(a)
                }
            }
        }
    }

Any ideas why?
Replies: 2 · Boosts: 1 · Views: 552 · Activity: Feb ’25
How to apply a geometry modifier to a VideoMaterial in RealityKit?
I have a model that uses a video material as the surface shader, and I also need to use a geometry modifier on the material. This seemed like it would be promising (adapted from https://developer.apple.com/wwdc21/10075 at ~5m 50s):

    // Did the setup for the video and AVPlayer, eventually leading me to
    let videoMaterial = VideoMaterial(avPlayer: avPlayer)

    // Assign the material to the entity
    entity.model!.materials = [videoMaterial]

    // The part shown in WWDC: the library and geometry modifier were set up before,
    // so now try to map the new custom material over the video material
    entity.model!.materials = entity.model!.materials.map { baseMaterial in
        try! CustomMaterial(from: baseMaterial, geometryModifier: geometryModifier)
    }

But I get the following error:

    Thread 1: Fatal error: 'try!' expression unexpectedly raised an error: RealityFoundation.CustomMaterialError.defaultSurfaceShaderForMaterialNotFound

How can I apply a geometry modifier to a VideoMaterial? Or, if I can't do that, is there an easy way to route the AVPlayer video data into the baseColor of a CustomMaterial?
Replies: 2 · Boosts: 0 · Views: 1.1k · Activity: 1w
visionOS VideoMaterial on 3D mesh
I'm trying to get a video material to work on an imported 3D asset, a USDC file. There's actually an example in this WWDC video from Apple - you can see it running on the flag of the airplane at 10:34 - but there is no sample code for it, and I haven't found other examples on the internet. Does anybody know how to do this? https://developer.apple.com/documentation/realitykit/videomaterial
Replies: 2 · Boosts: 0 · Views: 913 · Activity: 1w
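For what it's worth, a minimal sketch of the usual pattern: load the USDC, find the model entity that should show the video, and swap its materials for a VideoMaterial. The entity name "Flag" and the helper function are hypothetical placeholders, not taken from the WWDC sample.

```swift
import AVFoundation
import RealityKit

// Sketch: replace the materials on one mesh of an imported USDC with a video material.
func applyVideoMaterial(to root: Entity, videoURL: URL) {
    let player = AVPlayer(url: videoURL)
    let videoMaterial = VideoMaterial(avPlayer: player)

    // "Flag" is a placeholder for whatever the target mesh is named in the USDC hierarchy.
    if let flag = root.findEntity(named: "Flag") as? ModelEntity,
       var model = flag.model {
        // Replace every material slot on that mesh with the video material.
        model.materials = model.materials.map { _ in videoMaterial }
        flag.model = model
    }
    player.play()
}
```

If the mesh is nested, printing the entity hierarchy (or iterating over all ModelEntity descendants) is the quickest way to find the right name; the video then appears wherever that mesh's UVs map it.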
Metal calls hanging/stuck if app is started quickly after login
Our app uses Metal for image processing. We have found that if our app (and its possibly intensive image processing) is started soon after the user logs in, calls to Metal may hang for a good while. Example: something that usually takes 3-5 seconds can take 1-2 minutes! The Metal threads are just hanging in a memmove... In Activity Monitor we see a lot happening right after login, but why the Metal calls block for so long is unknown to us. The workaround is to wait a minute before starting the app and its intensive Metal image processing, but that is hard to explain to end users... It doesn't happen on all computers, but it is fairly easy to reproduce on some. We are using macOS 15.3.1, M1/M3 Max. Any good ideas for how to proceed with this problem and possibly reach out to Apple engineers? Thanks! :)
Replies: 2 · Boosts: 0 · Views: 438 · Activity: Feb ’25
CustomMetalView sample uses deprecated functions - update?
The sample code here has code like:

    // Create a display link capable of being used with all active displays
    cvReturn = CVDisplayLinkCreateWithActiveCGDisplays(&_displayLink);

But that function's documentation says it is deprecated and to use the NSView/NSWindow/NSScreen displayLink method instead, which returns a CADisplayLink, not a CVDisplayLink. Also, the documentation for that displayLink method is completely empty, so I'm not sure whether I'm supposed to add it to a run loop after I get it. It would be nice to get an updated version of this sample project and/or some documentation for NSView.displayLink.
Replies: 2 · Boosts: 0 · Views: 362 · Activity: Jul ’25
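For reference, a minimal sketch of the CADisplayLink-based replacement on macOS 14+; the view subclass and render selector names are illustrative, not from the sample. Unlike a CVDisplayLink, the CADisplayLink returned by NSView.displayLink(target:selector:) does need to be scheduled on a run loop.

```swift
import AppKit
import QuartzCore

// Sketch: drive per-frame rendering from NSView's display link (macOS 14+).
final class MetalHostView: NSView {
    private var activeDisplayLink: CADisplayLink?

    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        activeDisplayLink?.invalidate()
        guard window != nil else { return }

        // NSView.displayLink(target:selector:) tracks whichever display the view is on.
        let link = displayLink(target: self, selector: #selector(render(_:)))
        // Unlike CVDisplayLink, a CADisplayLink must be added to a run loop to fire.
        link.add(to: .main, forMode: .common)
        activeDisplayLink = link
    }

    @objc private func render(_ link: CADisplayLink) {
        // Encode and present a frame here; link.targetTimestamp is the
        // presentation deadline for this frame.
        needsDisplay = true
    }
}
```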
Why is SKPhysicsBody not picking up alpha?! SpriteKit for a watchOS game
So I'm trying to use SpriteKit to build the background of my game. The walls have alpha 1.0, and the safe area has alpha 0 and is fully transparent (e.g. a big black square with a smaller transparent square in the middle of it). Yet SpriteKit always assumes the entire image is either fully opaque or fully transparent. That defies its purpose, doesn't it? Is there a way to make this work?
Replies: 2 · Boosts: 0 · Views: 595 · Activity: Jan ’25
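In case it helps, a minimal sketch: SpriteKit only honors per-pixel alpha when the physics body is generated from the texture with an alpha threshold; bodies created from rectangles or circles ignore transparency entirely. The 0.5 threshold and the edge-loop fallback below are assumptions for illustration, not from the post.

```swift
import SpriteKit

// Sketch: derive a static physics body from the opaque pixels of the wall texture.
func makeWalls(from texture: SKTexture) -> SKSpriteNode {
    let walls = SKSpriteNode(texture: texture)
    // Pixels with alpha above 0.5 become part of the body (assumed threshold).
    walls.physicsBody = SKPhysicsBody(texture: texture,
                                      alphaThreshold: 0.5,
                                      size: texture.size())
    walls.physicsBody?.isDynamic = false
    return walls
}

// If the transparent "safe area" is a simple rectangle, an edge loop around it
// is often more reliable than tracing the texture (safeAreaRect is hypothetical).
func makeEdgeWalls(safeAreaRect: CGRect) -> SKNode {
    let node = SKNode()
    node.physicsBody = SKPhysicsBody(edgeLoopFrom: safeAreaRect)
    return node
}
```

Texture-derived bodies approximate the opaque region's outline, so for a simple rectangular play area the edge loop tends to be both cheaper and more predictable.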
SceneKit minimumPointScreenSpaceRadius not working
In the SceneKit framework, I want to render a point cloud and need to set the minimumPointScreenSpaceRadius property of the SCNGeometryElement class, but it doesn't work. I searched for related issues on the internet but didn't find a final solution. Here is my code:

    - (SCNGeometry *)createGeometryWithVector3Data:(NSData *)vData colorData:(NSData * _Nullable)cData {
        int stride = sizeof(SCNVector3);
        long count = vData.length / stride;

        SCNGeometrySource *dataSource =
            [SCNGeometrySource geometrySourceWithData:vData
                                             semantic:SCNGeometrySourceSemanticVertex
                                          vectorCount:count
                                      floatComponents:YES
                                  componentsPerVector:3
                                    bytesPerComponent:sizeof(float)
                                           dataOffset:0
                                           dataStride:stride];

        SCNGeometryElement *element =
            [SCNGeometryElement geometryElementWithData:nil
                                          primitiveType:SCNGeometryPrimitiveTypePoint
                                         primitiveCount:count
                                          bytesPerIndex:sizeof(int)];
        element.minimumPointScreenSpaceRadius = 1.0; // does not work
        element.maximumPointScreenSpaceRadius = 20.0;

        SCNGeometry *pointCloudGeometry;
        SCNGeometrySource *colorSource;
        if (cData && cData.length != 0) {
            colorSource =
                [SCNGeometrySource geometrySourceWithData:cData
                                                 semantic:SCNGeometrySourceSemanticColor
                                              vectorCount:count
                                          floatComponents:YES
                                      componentsPerVector:3
                                        bytesPerComponent:sizeof(float)
                                               dataOffset:0
                                              dataStride:stride];
            pointCloudGeometry = [SCNGeometry geometryWithSources:@[dataSource, colorSource] elements:@[element]];
        } else {
            pointCloudGeometry = [SCNGeometry geometryWithSources:@[dataSource] elements:@[element]];
        }
        return pointCloudGeometry;
    }
Replies: 2 · Boosts: 0 · Views: 121 · Activity: Apr ’25
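One hedged suggestion, based on common reports rather than documented behavior: the screen-space radius properties seem to take effect only when pointSize is also set on the point-type element, so try setting all three together.

```swift
import SceneKit

// Sketch (assumption): set pointSize alongside the screen-space radius bounds
// on a .point geometry element; the values here are illustrative.
func configurePointElement(_ element: SCNGeometryElement) {
    element.pointSize = 4.0                        // world-space size of each point
    element.minimumPointScreenSpaceRadius = 1.0    // lower bound, in screen points
    element.maximumPointScreenSpaceRadius = 20.0   // upper bound, in screen points
}
```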
How to play Vorbis/OGG files with Swift?
Does anyone have a working example of how to play OGG files with Swift? I've been trying for over a year now. I was able to wrap the C Vorbis library in Swift and used it to parse an OGG file successfully. Then I had to use Objective-C++ to fill the PCM, because this method seems to only be available in C++, and that part hangs my app for a good 40 seconds to several minutes depending on the audio file; it then plays for about 2 seconds and crashes. I can't get the examples on the Vorbis site to work in Objective-C, and I have tried every example on GitHub I could find (most of which are for iOS - I want to play the files on Mac). I also tried the Cricket Audio framework below. https://github.com/sjmerel/ck It has a Swift example and it can play their proprietary soundbank format, but it is also supposed to play OGG and it just does nothing when trying to play OGG, as you can see in the posted issue https://github.com/sjmerel/ck/issues/3 Right now I believe every player that can play OGGs on Mac is written in Objective-C or C++. Anyway, any help or advice is appreciated. The OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGGs through the Mono framework, but I really, really want to stay in Swift.
Replies: 2 · Boosts: 0 · Views: 4.1k · Activity: Jul ’25
Can't link metal-cpp to Modern Rendering With Metal sample
There is a sample project from Apple here. It has a scene of a city at night, and you can move around in it. It basically has two parts: application code written in what looks like Objective-C (I am more familiar with C++), which inherits from things like NSObject, MTKView, NSViewController and so on and handles input and all app- and window-related work, and rendering code that also looks like Objective-C. By the way, both parts are mostly in .mm files (Objective-C++, as far as I know). The application part directly uses only one class from the rendering part: AAPLRenderer. I want to move the rendering part to C++ using metal-cpp, and for that I need to link metal-cpp to the project. I have done this successfully with blank projects several times before, following this tutorial, but with this sample project Xcode can't find Foundation/Foundation.hpp (and the other metal-cpp headers). The error says:

    Did not find header 'Foundation.hpp' in framework 'Foundation' (loaded from '/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks')

Please help.
Replies: 2 · Boosts: 0 · Views: 849 · Activity: Feb ’25
Why slower with larger threadgroup memory?
I'm implementing an optimized matmul on Metal: https://github.com/crynux-ai/metal-matmul/blob/main/metal/1_shared_mem.metal I notice that performance differs significantly with different threadgroup memory sizes set via [computeEncoder setThreadgroupMemoryLength:atIndex:]. All other lines are exactly the same; the only difference is this parameter. Matmul performance is roughly 250 GFLOPS if I set 32768 (the max bytes allowed on this M1 Max), but 400 GFLOPS if I set 8192. Why does this happen? How can I optimize it?
Replies: 2 · Boosts: 0 · Views: 116 · Activity: Apr ’25
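A plausible explanation, offered as an assumption rather than a confirmed diagnosis: threadgroup memory is a shared per-GPU-core resource, so the amount each threadgroup reserves caps how many threadgroups can be resident on a core at once. With the 32 KB per-threadgroup limit the post mentions:

    resident threadgroups per core ≈ floor(32768 / threadgroupMemoryLength)
    32768 bytes → 1 resident threadgroup        8192 bytes → 4 resident threadgroups

Fewer resident threadgroups means fewer SIMD-groups available to hide memory latency, which by itself could account for a 250 vs. 400 GFLOPS gap. Requesting only as much threadgroup memory as the tile size actually needs, and checking occupancy in the Metal GPU profiler, would confirm or rule this out.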
Matchmaking issue on tvOS with GKMatchmakerViewController
Hello, we are working on a real-time 2-player online game targeting multiple Apple devices. The following issue only occurs on tvOS: when selecting matchmaking to connect with another player, the native Game Center interface opens and begins the matchmaking process. Almost immediately, the following log appears in the console, and the matchmaking screen remains indefinitely without completing:

    Timeout while starting matching with request: <GKMatchRequestInternal 0x30d62f690> {
        defaultNumberOfPlayers : 0
        isLateJoin : 0
        localPlayerID : U:bea182d69b85f0839e3958742fbc4609
        matchType : 0
        maxPlayers : 2
        minPlayers : 2
        playerAttributes : 4294967295
        playerGroup : 1
        preloadedMatch : 0
        recipientPlayerIDs : <__NSArrayM 0x3034ed5c0> {}
        recipients : <__NSArrayM 0x3034ee280> {}
        restrictToAutomatch : 0
        version : 1
        archivedSharePlayInviteeTokensFromProgrammaticInvite, inviteMessage, localizableInviteMessage, messagesBasedRecipients, properties, queueName, recipientProperties, rid, sessionToken : (null)
    } . Error: (null)

However, the task does not complete when the log appears (our Debug.Log callbacks are never called). But if we manually cancel the matchmaking process, the "User cancel" log is correctly triggered. Here is a code snippet for the request:

    var gkMatchRequest = GKMatchRequest.Init();
    gkMatchRequest.MinPlayers = 2;
    gkMatchRequest.MaxPlayers = 2;
    var matchRequestTask = GKMatchmakerViewController.Request(gkMatchRequest);
    matchRequestTask.ContinueWith(t => { Debug.LogException(t.Exception); }, TaskContinuationOptions.OnlyOnFaulted);
    matchRequestTask.ContinueWith(t => { Debug.LogInfo("User cancel"); }, TaskContinuationOptions.OnlyOnCanceled);
    matchRequestTask.ContinueWith(t => { Debug.LogInfo("Success"); }, TaskContinuationOptions.OnlyOnRanToCompletion);

We have tested this on multiple devices and network types (Wi-Fi, 5G, Ethernet), but we consistently encounter this bug along with the same log message. Could you please help us understand or resolve this issue? Thank you.
Replies: 2 · Boosts: 0 · Views: 160 · Activity: Apr ’25
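One way to narrow this down, as a hedged sketch: run the same 2-player request through the native GKMatchmakerViewController API, outside the Unity plugin layer. The coordinator class and the presentation details below are illustrative; only the min/max player counts mirror the post.

```swift
import GameKit
import UIKit

// Sketch: present the native matchmaker directly to check whether the timeout
// also happens without the plugin bridging in between.
final class MatchmakingCoordinator: NSObject, GKMatchmakerViewControllerDelegate {
    func start(from presenter: UIViewController) {
        let request = GKMatchRequest()
        request.minPlayers = 2
        request.maxPlayers = 2

        guard let vc = GKMatchmakerViewController(matchRequest: request) else { return }
        vc.matchmakerDelegate = self
        presenter.present(vc, animated: true)
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFind match: GKMatch) {
        viewController.dismiss(animated: true)
        print("Success: match found with \(match.players.count + 1) players")
    }

    func matchmakerViewControllerWasCancelled(_ viewController: GKMatchmakerViewController) {
        viewController.dismiss(animated: true)
        print("User cancel")
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFailWithError error: Error) {
        viewController.dismiss(animated: true)
        print("Matchmaking failed:", error)
    }
}
```

If the native path matches fine, the timeout likely lives in the plugin layer; if it also hangs, that points at the Game Center service or the tvOS configuration and is worth a Feedback Assistant report with the same log attached.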
Resources for retro-style games wanting 90-degree window corners
I've been thinking of bringing some older games back to the modern Mac: rewriting old titles in Swift but using the original data files, which assume windows with non-rounded corners. Many of these games require the full window area of a 90-degree-cornered window. Can anyone point me at some useful workarounds, or is Apple simply deaf to the needs of this type of product?
Replies: 2 · Boosts: 1 · Views: 451 · Activity: 2w
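One workaround worth noting, as a minimal sketch: a borderless window is not given the standard title bar or rounded-corner treatment, so a game can draw into every pixel of a square frame. The size, colors, and behavior flags below are illustrative.

```swift
import AppKit

// Sketch: a borderless window with square corners for a retro game surface.
func makeSquareCornerWindow() -> NSWindow {
    let window = NSWindow(contentRect: NSRect(x: 200, y: 200, width: 640, height: 480),
                          styleMask: [.borderless],
                          backing: .buffered,
                          defer: false)
    window.isMovableByWindowBackground = true   // drag anywhere, since there is no title bar
    window.backgroundColor = .black
    window.makeKeyAndOrderFront(nil)
    return window
}
```

Because borderless windows refuse key status by default, subclass NSWindow and override canBecomeKey to return true if the game needs keyboard input.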
RealityView postProcess effect depth texture
Hello, a question re: iOS RealityView postProcess. I've got a working postProcess kernel and I'd like to add some depth-based effects to it. Theoretically I should be able to just do:

    encoder.setTexture(context.sourceDepthTexture, index: 1)

and then in the kernel:

    texture2d<float, access::read> depthIn [[texture(1)]]
    ...
    outTexture.write(depthIn.read(gid), gid);

But I consistently see all black rendered to the view. The postProcess shader works, so that's not the issue; it just seems not to be receiving actual depth information. (If I set a breakpoint at the encoder's setTexture step, I can preview the color texture of the scene, but the context's depthTexture looks all NaN / blank.) I've looked at all the WWDC samples, but they use ARView for all the depth sample code, which has a different set of configuration options than RealityView. So far I haven't seen anywhere to explicitly tell RealityView to "include the depth information", so I'm not sure if I'm missing something there. It appears that a depth texture is indeed being passed, but it looks blank. Is there a working example somewhere that we can reference?
Replies: 2 · Boosts: 0 · Views: 518 · Activity: Nov ’25
Physics bug in WWE 2K25 with GPTK 2.1
The game physics work as expected using GPTK 2.0 with CrossOver 24 or Whisky. However, using GPTK 2.1 with CrossOver 25, the player and camera physics misbehave. See https://www.reddit.com/r/WWEGames/comments/1jx9mph/the_siamese_elbow/ and https://www.reddit.com/r/WWEGames/comments/1jx9ow4/camera_glitch/ The full video is also linked in the Reddit post. I have also submitted this bug via Feedback Assistant.
Replies: 2 · Boosts: 0 · Views: 193 · Activity: Apr ’25