Create immersive Unity apps


Discuss the WWDC23 Session Create immersive Unity apps


Posts under wwdc2023-10088 tag

10 Posts
Post not yet marked as solved
0 Replies
501 Views
Hello, I am an Apple developer on Windows. Quite the match, right? I have a serious bug in my Unity game and I received a crash log file, but there's one problem: I can't understand it, and I do not have Xcode. The file itself says it's bug type 309, and I can't symbolicate it either. Does anyone have any suggestions? Thank you. Here's a little snippet, if anyone can help me out:

```json
"ime" : 7500, "procRole" : "Foreground", "version" : 2, "userID" : 501,
"deployVersion" : 210, "modelCode" : "iPhone11,8", "coalitionID" : 916,
"osVersion" : { "isEmbedded" : true, "train" : "iPhone OS 17.0", "releaseType" : "User", "build" : "21A329" },
"captureTime" : "2023-09-19 09:56:34.6392 -0700", "codeSigningMonitor" : 1,
"incident" : "A3F5FA70-7C68-498A-A534-A55E23E88263", "pid" : 903,
"cpuType" : "ARM-64", "roots_installed" : 0, "bug_type" : "309",
"procLaunch" : "2023-09-19 09:56:15.5922 -0700",
"procStartAbsTime" : 180831869743, "procExitAbsTime" : 181288438801,
"procName" : "El",
"procPath" : "/private/var/containers/Bundle/Application/EC7A3162-12FE-4340-B40A-172D8C554969/EndlessFall.app/EndlessFall",
"bundleInfo" : {"CFBundleShortVersionString":"1.9","CFBundleVersion":"0.9.18","CFBundleIdentifier":"com.Bl","DTAppStoreToolsBuild":"1a"},
"storeInfo" : {"itemID":"641","deviceIdentifierForVendor":"5557A5A3-438E-4341-9D8D-3256CE7DDCCD","thirdParty":true,"softwareVersionExternalIdentifier":"859958770"},
"parentProc" : "launchd", "parentPid" : 1
```
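For what it's worth, the report header is plain JSON, so the fields a triager usually wants can be pulled out on any OS without Xcode. A minimal Python sketch, assuming the report is (or begins with) a JSON object like the snippet above; the SAMPLE blob below is a made-up miniature of that snippet, not a real report:

```python
import json

# Hypothetical sample mirroring a few fields from the crash log snippet above.
SAMPLE = """
{
  "bug_type": "309",
  "procName": "EndlessFall",
  "osVersion": {"train": "iPhone OS 17.0", "build": "21A329"},
  "captureTime": "2023-09-19 09:56:34.6392 -0700"
}
"""

def summarize_crash(report_json: str) -> dict:
    """Pull the fields most useful for triage out of a crash-report JSON blob."""
    data = json.loads(report_json)
    os_info = data.get("osVersion", {})
    return {
        "bug_type": data.get("bug_type"),
        "process": data.get("procName"),
        "os": f'{os_info.get("train", "?")} ({os_info.get("build", "?")})',
        "captured": data.get("captureTime"),
    }

print(summarize_crash(SAMPLE))
```

This only reads the metadata; actually symbolicating the backtrace still requires the app's dSYM files and a symbolication tool.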
Posted by unled.
Post not yet marked as solved
0 Replies
597 Views
In the videos, it's mentioned that depth is required for all rendered pixels. Is transparency not supported? Specifically, regarding URP shaders, is transparency supported only in fully immersive mode, or is it not supported at all?
Post not yet marked as solved
0 Replies
766 Views
Unlit is mentioned together with occlusion (passthrough), which makes me suspect it might be a pre-built shader. As a workaround, it's of course possible to make something close to unlit using PBR emission, but that would probably be less efficient...
Post not yet marked as solved
1 Reply
580 Views
According to the video, immersive app rendering is controlled by RealityKit. Does that mean the render pipeline cannot be customized, for example with a RendererFeature or by modifying the render pipeline directly? For custom shaders, it seems that the shader graph is converted to MaterialX. The standard surface shader for MaterialX supports Subsurface and Coat. Does RealityKit support them? If so, how can these properties be mapped from a shader graph when the URP Lit shader graph has no built-in support for Subsurface or Coat?
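For context on the MaterialX side of the question: the standard_surface node does expose subsurface and coat as ordinary inputs. A minimal .mtlx fragment (node and material names here are made up for illustration):

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <!-- standard_surface exposes subsurface and coat as plain float inputs -->
  <standard_surface name="skin_shader" type="surfaceshader">
    <input name="base_color" type="color3" value="0.80, 0.62, 0.55" />
    <input name="subsurface" type="float" value="0.4" />
    <input name="coat" type="float" value="0.25" />
  </standard_surface>
  <surfacematerial name="skin_material" type="material">
    <input name="surfaceshader" type="surfaceshader" nodename="skin_shader" />
  </surfacematerial>
</materialx>
```

This shows only what MaterialX itself can express; whether the Unity-to-MaterialX conversion and RealityKit honor these inputs is exactly the open question in this post.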
Posted by SetoKaiba.
Post not yet marked as solved
0 Replies
1.1k Views
Hi, I have a question about Apple Vision Pro's support for Unity programmable shaders. Shaders applied to a Material are not supported; RenderTextures are supported (they can be used as texture input to a Shader Graph for display through RealityKit). Regarding the above, does this apply to Shared Space, Full Space, and fully immersive space alike? Is fully immersive space irrelevant here because it uses Metal and not RealityKit? Best regards. Sadao Tokuyama https://1planet.co.jp/ https://twitter.com/tokufxug
Post not yet marked as solved
0 Replies
576 Views
Hi, I am currently watching the Create immersive Unity apps video from WWDC23, and a question arose while watching it. First, consider the following explanations from the session:

"Because you're using Unity to create volumetric content that participates in the shared space, a new concept called a volume camera lets you control how your scene is brought into the real world. A volume camera can create two types of volumes, bounded and unbounded, each with different characteristics. Your application can switch between the two at any time." https://developer.apple.com/videos/play/wwdc2023/10088/?time=465

"Your unbounded volume displays in a full space on this platform and allows your content to fully blend with passthrough for a more immersive experience." https://developer.apple.com/videos/play/wwdc2023/10088/?time=568

At first, the session explains that there are two types of volumetric content in the Shared Space: bounded volumes and unbounded volumes. However, when it gets to the description of the unbounded volume, the setting changes to Full Space. Is Full Space, not Shared Space, correct for an unbounded volume?

Best regards.

P.S. I felt uncomfortable with the title Create immersive Unity apps. The first half of the presentation was about Unity development and bounded volumes in the Shared Space, and bounded-volume apps feel far from immersive. Apple's definition of "immersive" in spatial computing seems vague.

Sadao Tokuyama https://1planet.co.jp/ https://twitter.com/tokufxug
Post not yet marked as solved
1 Reply
1.3k Views
Do I understand correctly that to use Unity (Universal Render Pipeline) for Vision Pro's fully immersive apps, we can use Unity's Shader Graph as usual to make custom shaders, but for immersive mixed reality apps we cannot use Shader Graph anymore and instead have to create shaders for Unity in Reality Composer? How does bringing a Reality Composer shader into Unity work: does it simply work in Unity, or does it require special adaptation for Unity? Are there cases where it's better to avoid Reality Composer and use Unity's Shader Graph for immersive Vision apps? For instance, we may lose real-time lighting adaptation for virtual objects, but on the other hand we would be able to use Shader Graph.
Posted by NikitaSh.
Post not yet marked as solved
0 Replies
903 Views
On visionOS, is a combination of full passthrough, unbounded volumes, and my own custom 3D rendering in Metal possible?

According to the RealityKit and Unity visionOS talk, towards the end, it's shown that an unbounded volume mode allows you to create full passthrough experiences with graphics rendered in 3D: essentially full 3D AR in which you can move around the space. It's also shown that you can get occlusion for the graphics.

This is all great; however, I don't want to use RealityKit or Unity in my case. I would like to be able to render to an unbounded volume using my own custom Metal renderer, and still get AR passthrough and the ability to walk around and composite virtual graphical content with the background. To reiterate, this is exactly what is shown in the video using Unity, but I'd like to use my own renderer instead of Unity or RealityKit. This doesn't require access to the video camera texture, which I know is unavailable. Having the flexibility to create passthrough-mode content in a custom renderer is super important for making an AR experience in which I have control over rendering.

One use case I have in mind is Wizard's Chess: you see the real world and can walk around a room-size chessboard with virtual chess pieces mixed with the real world, and you can see the other player through passthrough as well. I'd like to render graphics on my living room couches using scene reconstruction mesh anchors, for example, to change the atmosphere. The video already shows several nice use cases, like being able to interact with a tabletop fantasy world with characters.

Is what I'm describing possible with Metal? Thanks!

EDIT: Also, if not volumes, then full spaces? I don't need access to the camera images that are off-limits. I would just like passthrough + composition with 3D Metal content + full ARKit tracking and occlusion features.
Post not yet marked as solved
0 Replies
467 Views
I am interested in experimenting with different algorithms to support people with color vision deficiency. Can shaders and/or color matrix transformations be applied to the camera images themselves via Unity, and, if so, can they be applied to each eye independently? Thanks, Tom
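As background on the color-matrix part of the question: a common approach is to multiply each RGB pixel by a 3x3 matrix, for example to simulate a deficiency before testing a correction. A minimal Python sketch; the protanopia coefficients below are one widely circulated approximation, not an authoritative model:

```python
# Simulate protanopia by applying a 3x3 color matrix to an RGB triple.
# These coefficients are a commonly cited approximation, not authoritative.
PROTANOPIA = [
    [0.567, 0.433, 0.000],
    [0.558, 0.442, 0.000],
    [0.000, 0.242, 0.758],
]

def apply_color_matrix(rgb, matrix):
    """Multiply an (r, g, b) triple by a 3x3 matrix, clamping to [0, 1]."""
    return tuple(
        min(1.0, max(0.0, sum(m * c for m, c in zip(row, rgb))))
        for row in matrix
    )

# Pure red loses most of its distinctiveness under simulated protanopia.
print(apply_color_matrix((1.0, 0.0, 0.0), PROTANOPIA))  # → (0.567, 0.558, 0.0)
```

In a shader the same multiply would run per pixel on the GPU; the open question in the post is whether Unity on this platform exposes the camera image (and each eye's image separately) as an input at all.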
Posted by tpntpn.