Hi,
I am creating an ECS. With this ECS I will need to register several DragGestures.
Question: Is it possible to define DragGestures in an ECS? If yes, how do we do that? If not, what is the best way to do it?
Question: Is there a "gesture" method that takes an array of gestures as a parameter?
I am interested in any information that can help me, if possible with an example of code.
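For reference, one pattern that might fit (a sketch, not the only way, and the names `DraggableComponent` and `ContentView` are mine): instead of registering one DragGesture per entity, attach a single gesture to the RealityView and route it through a marker component, so the ECS side only decides which entities opt in.

```swift
import SwiftUI
import RealityKit

// Hypothetical marker component: entities carrying it accept drags.
struct DraggableComponent: Component {}

struct ContentView: View {
    var body: some View {
        RealityView { content in
            // Build the scene; call DraggableComponent.registerComponent()
            // once at startup and add the component to draggable entities.
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Route: only entities that opted in via the component react.
                    guard value.entity.components.has(DraggableComponent.self) else { return }
                    let position = value.convert(value.location3D, from: .local, to: .scene)
                    value.entity.setPosition(position, relativeTo: nil)
                }
        )
    }
}
```

As for a `gesture` method taking an array: I don't believe one exists, but `.targetedToEntity(where: .has(DraggableComponent.self))` can do the filtering for you, and several gestures can be chained with separate `.gesture` modifiers.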
Regards
Tof
Hi,
When looking at the RoomAnchor documentation I can see the planeAnchorIDs property.
My question: how can I get an array of PlaneAnchor instances from planeAnchorIDs?
A code example would be greatly appreciated.
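For what it's worth, since planeAnchorIDs is just an array of UUIDs, one approach (a sketch; `PlaneStore` and `planes(for:)` are hypothetical names) is to cache the PlaneAnchors delivered by a PlaneDetectionProvider and resolve the IDs against that cache:

```swift
import ARKit

// Sketch: PlaneDetectionProvider delivers PlaneAnchors; cache them by ID,
// then resolve a RoomAnchor's planeAnchorIDs against the cache.
actor PlaneStore {
    private var planeCache: [UUID: PlaneAnchor] = [:]

    // Run this task for the lifetime of the ARKit session.
    func run(_ provider: PlaneDetectionProvider) async {
        for await update in provider.anchorUpdates {
            switch update.event {
            case .added, .updated:
                planeCache[update.anchor.id] = update.anchor
            case .removed:
                planeCache[update.anchor.id] = nil
            }
        }
    }

    // Map the room's plane IDs onto the anchors we have seen so far.
    func planes(for room: RoomAnchor) -> [PlaneAnchor] {
        room.planeAnchorIDs.compactMap { planeCache[$0] }
    }
}
```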
Regards
Tof
Hi,
I wanted to do something quite simple: Put a box on a wall or on the floor.
My box:
let myBox = ModelEntity(
    mesh: .generateBox(size: SIMD3<Float>(0.1, 0.1, 0.01)),
    materials: [SimpleMaterial(color: .systemRed, isMetallic: false)],
    collisionShape: .generateBox(size: SIMD3<Float>(0.1, 0.1, 0.01)),
    mass: 0.0)
For that I used plane detection to identify the walls and floor in the room. Then, with a SpatialTapGesture, I was able to retrieve the position the user was looking at when tapping.
let position = value.convert(value.location3D, from: .local, to: .scene)
And then positioned my box
myBox.setPosition(position, relativeTo: nil)
When I then tested it, I realized that the box was not parallel to the wall but sat at a slightly inclined angle.
I also realized that if I tried to put the box on the wall to my left, it was placed perpendicular to that wall rather than flat against it.
After various searches and several attempts, I ended up playing with transform.matrix to identify whether the plane is a wall or a floor, and whether it is in front of me or to the side, and set up a rotation on the box to "place" it on the wall or floor.
let surfaceTransform = surface.transform.matrix
let surfaceNormal = normalize(surfaceTransform.columns.2.xyz)
let baseRotation = simd_quatf(angle: .pi, axis: SIMD3<Float>(0, 1, 0))
var finalRotation: simd_quatf
if acos(abs(dot(surfaceNormal, SIMD3<Float>(0, 1, 0)))) < 0.3 {
    logger.info("Surface: ceiling/floor")
    finalRotation = simd_quatf(angle: surfaceNormal.y > 0 ? 0 : .pi, axis: SIMD3<Float>(1, 0, 0))
} else if abs(surfaceNormal.x) > abs(surfaceNormal.z) {
    logger.info("Surface: left/right")
    finalRotation = simd_quatf(angle: surfaceNormal.x > 0 ? .pi/2 : -.pi/2, axis: SIMD3<Float>(0, 1, 0))
} else {
    logger.info("Surface: front/back")
    finalRotation = baseRotation
}
Playing with matrices is not really my thing, so I don't know if I'm doing it right.
Could you tell me if my tests for the orientation of the walls are correct? During my tests I don't always correctly identify whether the wall is in front of me or to the side.
Is this generally the right way to do it?
Is there an easier way?
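One possibly simpler alternative (a sketch only, and it assumes the plane anchor's local Y axis is the plane normal, which should be verified): reuse the plane's own rotation instead of classifying the normal by hand, then tilt the box so its thin side faces out of the plane.

```swift
import RealityKit
import simd

// Sketch: orient a thin box flat against a detected plane by taking the
// plane's rotation wholesale, rather than branching on the normal direction.
// `surfaceTransform` would be the plane anchor's transform matrix.
func orientBox(_ box: ModelEntity, on surfaceTransform: float4x4, at position: SIMD3<Float>) {
    // Extract the plane's rotation (assumes no scale in the transform).
    var orientation = simd_quatf(surfaceTransform)
    // Assumption: the plane's local +Y is its normal. The box's thin side is
    // local +Z, so rotate local +Z onto +Y (-90° about X) to lie flat.
    orientation *= simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))
    box.setPosition(position, relativeTo: nil)
    box.setOrientation(orientation, relativeTo: nil)
}
```

Because the full rotation comes from the plane itself, a wall to your left gets the same treatment as a wall in front of you, which may remove the front/side branching entirely.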
Regards
Tof
Hi,
On visionOS, to manage entity rotation, we can rely on RotateGesture3D. With the constrainedToAxis parameter we can even allow rotation only around the x, y, or z axis, or combinations of them.
What I want to know is whether it is possible to constrain the rotation axis automatically.
Let me explain: the functionality I would like to implement is to constrain the rotation to one axis only once the user has started their gesture. The initial movement the user makes should tell us which axis they want to rotate around.
This would be equivalent to automatically activating a constraint on one of the axes, as if we had defined the gesture on that axis:
RotateGesture3D(constrainedToAxis: .x)
RotateGesture3D(constrainedToAxis: .y)
RotateGesture3D(constrainedToAxis: .z)
Is it possible to do this?
If so, what would be the best way to do it?
A code example would be greatly appreciated.
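One idea (a sketch under assumptions: the lock threshold, `lockedAxis`, and the view name are mine, and the rotation is applied as an absolute value rather than composed with the entity's starting orientation): run RotateGesture3D unconstrained, inspect the first meaningful delta to pick the dominant axis, then project every later rotation onto that axis.

```swift
import SwiftUI
import RealityKit
import Spatial
import simd

struct AxisLockedRotationView: View {
    // Axis chosen from the user's initial movement; nil until locked.
    @State private var lockedAxis: SIMD3<Double>? = nil

    var body: some View {
        RealityView { content in /* add the rotatable entity… */ }
            .gesture(
                RotateGesture3D()
                    .targetedToAnyEntity()
                    .onChanged { value in
                        let angle = value.rotation.angle.radians
                        let axis = value.rotation.axis
                        if lockedAxis == nil, abs(angle) > 0.05 {
                            // Lock to whichever axis the initial gesture leans toward.
                            let a = SIMD3(abs(axis.x), abs(axis.y), abs(axis.z))
                            if a.x >= a.y && a.x >= a.z { lockedAxis = [1, 0, 0] }
                            else if a.y >= a.z          { lockedAxis = [0, 1, 0] }
                            else                        { lockedAxis = [0, 0, 1] }
                        }
                        guard let locked = lockedAxis else { return }
                        // Keep only the rotation component about the locked axis.
                        let projected = angle * simd_dot(SIMD3(axis.x, axis.y, axis.z), locked)
                        let constrained = Rotation3D(
                            angle: .radians(projected),
                            axis: RotationAxis3D(x: locked.x, y: locked.y, z: locked.z))
                        let q = constrained.quaternion
                        value.entity.orientation = simd_quatf(vector: SIMD4<Float>(q.vector))
                    }
                    .onEnded { _ in lockedAxis = nil }
            )
    }
}
```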
Regards
Tof
Hi,
In a visionOS application I have an entity that is displayed. I can give it a certain velocity by making it collide with another entity.
I would also like to be able to drag the entity and give it a velocity via the drag.
I searched the sample projects and found nothing, and I also searched the Internet without finding anything clear on the subject.
Looking at the drag gesture documentation I found gestureValue.velocity, but I have no idea how to use this property. I'm not even sure it is useful to me, because it's a CGSize and so, a priori, not intended for a 3D gesture.
If you have any information that would help me implement this, I would be grateful. 🙏🏻
DragGesture()
    .targetedToAnyEntity()
    .onChanged { pValue in
        // Some code
    }
    .onEnded { pValue in
        // pValue.gestureValue.velocity
    }
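One approach that seems to work (a sketch; the sampling bookkeeping and `ThrowableDrag` are my own names, not an official API): ignore the 2D CGSize `velocity` and derive a 3D velocity yourself from successive drag positions, then hand it to a PhysicsMotionComponent when the drag ends.

```swift
import SwiftUI
import RealityKit

struct ThrowableDrag: View {
    // Last sampled drag position/time, used to estimate a 3D velocity.
    @State private var lastPosition: SIMD3<Float>? = nil
    @State private var lastTime: Date? = nil
    @State private var velocity: SIMD3<Float> = .zero

    var body: some View {
        RealityView { content in
            // Add an entity with a dynamic PhysicsBodyComponent here.
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    let position = value.convert(value.location3D, from: .local, to: .scene)
                    let now = Date()
                    if let last = lastPosition, let t = lastTime {
                        let dt = Float(now.timeIntervalSince(t))
                        if dt > 0 { velocity = (position - last) / dt }
                    }
                    value.entity.setPosition(position, relativeTo: nil)
                    lastPosition = position
                    lastTime = now
                }
                .onEnded { value in
                    // Hand the measured 3D velocity to the physics engine.
                    value.entity.components.set(
                        PhysicsMotionComponent(linearVelocity: velocity))
                    lastPosition = nil
                    lastTime = nil
                }
        )
    }
}
```

In practice you may want to average the velocity over the last few samples instead of using only the final delta, which can be noisy.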
Hi,
I manage apps for several clients.
I have 2 apps for 2 different clients in which I use OneSignal version 5.1.5 (https://github.com/OneSignal/OneSignal-XCFramework).
The OneSignal part of these two applications is configured identically.
For both applications I can
Build the project
Create an archive of the project
Send the archive to TestFlight
Install the TestFlight version on an iPhone
Launch the app on the iPhone
For the 1st application, I can submit it to the App Store without any problem.
For the 2nd application, when I submit it to the App Store, I get error ITMS-91065 almost immediately by email:
ITMS-91065: Missing signature - Your app includes “Frameworks/OneSignalCore.framework/OneSignalCore”, which includes OneSignalCore, an SDK that was identified in the documentation as a privacy-impacting third-party SDK. If a new app includes a privacy-impacting SDK, or an app update adds a new privacy-impacting SDK, the SDK must include a signature file. Please contact the provider of the SDK that includes this file to get an updated SDK version with a signature. For details about verifying the code signature for a third-party SDK, visit: https://developer.apple.com/documentation/xcode/verifying-the-origin-of-your-xcframeworks.
ITMS-91065: Missing signature - Your app includes “Frameworks/OneSignalExtension.framework/OneSignalExtension”, which includes OneSignalExtension, an SDK that was identified in the documentation as a privacy-impacting third-party SDK. If a new app includes a privacy-impacting SDK, or an app update adds a new privacy-impacting SDK, the SDK must include a signature file. Please contact the provider of the SDK that includes this file to get an updated SDK version with a signature. For details about verifying the code signature for a third-party SDK, visit: https://developer.apple.com/documentation/xcode/verifying-the-origin-of-your-xcframeworks.
ITMS-91065: Missing signature - Your app includes “Frameworks/OneSignalOutcomes.framework/OneSignalOutcomes”, which includes RxSwift, an SDK that was identified in the documentation as a privacy-impacting third-party SDK. If a new app includes a privacy-impacting SDK, or an app update adds a new privacy-impacting SDK, the SDK must include a signature file. Please contact the provider of the SDK that includes this file to get an updated SDK version with a signature. For details about verifying the code signature for a third-party SDK, visit: https://developer.apple.com/documentation/xcode/verifying-the-origin-of-your-xcframeworks.
I checked, and there are no errors in the project of the 2nd application.
I don't understand why the framework is accepted for the 1st application and refused for the 2nd.
My config: macOS 14.5 (23F79), Xcode 15.4 (15F31d)
Has anyone ever had this kind of problem?
Regards
Tof
Hi,
I can't recompile a project for visionOS with the latest beta.
It gives me the error:
Value of type 'WorldTrackingProvider' has no member 'queryPose'
In the online documentation queryPose still exists, and Xcode does not indicate which API to use instead.
Does anyone have an idea of the API that should now be used instead of queryPose?
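For what it's worth, in later visionOS betas the device pose query appears to have moved to queryDeviceAnchor(atTimestamp:) on WorldTrackingProvider, which returns a DeviceAnchor instead of a pose. A sketch, assuming that API is available in your SDK:

```swift
import ARKit
import QuartzCore

// Sketch: queryPose's replacement returns a DeviceAnchor; its transform
// gives the device pose in the world coordinate space.
func currentDevicePose(_ worldTracking: WorldTrackingProvider) -> simd_float4x4? {
    guard let deviceAnchor = worldTracking.queryDeviceAnchor(
        atTimestamp: CACurrentMediaTime()) else { return nil }
    return deviceAnchor.originFromAnchorTransform
}
```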
Hi,
In the visionOS documentation
Positioning and sizing windows - Specify initial window position
In visionOS, the system places new windows directly in front of people, where they happen to be gazing at the moment the window opens.
Positioning and sizing windows - Specify window resizability
In visionOS, the system enforces a standard minimum and maximum size for all windows, regardless of the content they contain.
The first thing I don't understand is why this page talks about macOS in the visionOS documentation.
The second thing: what is this page for, if it's just to tell us that on visionOS we have no control over the position and size of 2D windows? It is precisely the opposite that would be interesting. I don't understand this limitation; it greatly limits the use of 2D windows on visionOS.
I really hope that this limitation will disappear in future betas.
Hi,
The video of "Take SwiftUI to the next dimension" is interesting but the code snippets that are provided are not enough.
Would it be possible to have the complete project associated with this video?
In general, it would be very useful to have the complete projects for all the videos on "Spatial Computing".
I have a ModelEntity object that I would like to convert to a Model3D.
Is it possible?
If yes, how should I go about it?
I followed the steps described in "Setting up access to ARKit data".
It does not work.
It gives me the error: No Immersive Space with id 'appSpace' is defined
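In my understanding, that error means no ImmersiveSpace scene with that id exists in the App body; a minimal sketch of what the app declaration would need (the type names other than ImmersiveSpace are placeholders):

```swift
import SwiftUI
import RealityKit

@main
struct MyApp: App {  // placeholder name
    var body: some Scene {
        WindowGroup {
            ContentView()  // placeholder view
        }
        // The id here must match the one passed to openImmersiveSpace.
        ImmersiveSpace(id: "appSpace") {
            RealityView { content in
                // ARKit-backed content goes here.
            }
        }
    }
}
```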
Can PlaneDetectionProvider work in the simulator?
Can ARKit generally work in the simulator?
I created a window with the following code
WindowGroup {
    MainView()
        .ornament(attachmentAnchor: .scene(alignment: .bottom)) {
            OrnementView(title: "Bottom")
        }
        .ornament(attachmentAnchor: .scene(alignment: .top)) {
            OrnementView(title: "Top")
        }
        .ornament(attachmentAnchor: .scene(alignment: .leading)) {
            OrnementView(title: "Leading")
        }
        .ornament(attachmentAnchor: .scene(alignment: .trailing)) {
            OrnementView(title: "Trailing")
        }
        .frame(minWidth: 100, maxWidth: 400, minHeight: 100, maxHeight: 400)
        .border(.green)
}
It shows me this
When I resize the window, the ornaments do not follow it.
Hi,
Since yesterday (May 11, 2023) I can no longer send an update to TestFlight for a client's Apple Developer account on which I am an admin. It gives me the following error:
The last time I sent a version to TestFlight was May 4, 2023, and I had no problem.
With other clients' applications I have no problem sending my updates to TestFlight.
I have no problem connecting to App Store Connect with my Apple ID, and I can see that I am an admin on the Apple Developer account.
My config:
macOS 13.3.1 (a)
Xcode Version 14.3 (14E222b)
Regards
Tof
Hi,
I used MacVM to install Ventura in a VM.
With macOS Ventura beta 2 I have several problems that I would have liked to report via the Feedback Assistant application.
The problem is that the application refuses my Apple ID and displays the message "An error occurred during authentication."
To verify my Apple ID, I logged into my developer account via Safari; I was able to do so without any problem.
Hi,
After updating Xcode to version 13.3, iCloud showed me the following screen.
I also tried to update to macOS 12.3; I tried several times and I always get this screen.
Now, when I start my Mac, it systematically asks me to reconnect to iCloud.
This is the first time this has happened to me after an Xcode update!
Any help will be appreciated.
(My config: MacBook Pro (16-inch, 2021), M1 Max)