I'm trying to add support for the PS5 DualSense controller. When I try to use the API from here: https://developer.apple.com/documentation/gamecontroller/gcdualsenseadaptivetrigger?language=objc, none of the APIs work. Am I missing anything? The code is like this:

    if ( [ controller.extendedGamepad isKindOfClass:[ GCDualSenseGamepad class ] ] )
    {
        GCDualSenseGamepad *dualSenseGamePad = ( GCDualSenseGamepad * )controller.extendedGamepad;
        auto funcSetEffectTrigger = []( TriggerEffectParams &params, GCDualSenseAdaptiveTrigger *trigger )
        {
            if ( params.m_mode == TriggerEffectMode::Off )
            {
                [ trigger setModeOff ];
                NSLog( @"setModeOff trigger.mode:%d", trigger.mode );
            }
            else if ( params.m_mode == TriggerEffectMode::Feedback )
            {
                [ trigger setModeFeedbackWithStartPosition: 0.2f resistiveStrength: 0.5f ];
            }
            else if ( params.m_mode == TriggerEffectMode::Weapon )
            {
                [ trigger setModeWeaponWithStartPosition: 0.2f endPosition: 0.4f resistiveStrength: 0.5f ];
            }
            else if ( params.m_mode == TriggerEffectMode::Vibration )
            {
                [ t
I solved the issue by ensuring that only the aggregate scheme was built, using the -scheme option, though there is still a problem with how Xcode behaves if one does not include this parameter. I changed my build command from this:

    xcodebuild -configuration Release -project AggregateProject.xcodeproj

to this:

    xcodebuild -configuration Release -scheme "Build All Projects" -project AggregateProject.xcodeproj

After building this way, all of the products ended up in the default Derived Data location (~/Library/Developer/Xcode/DerivedData/AggregateProject-blahblah/Build/Products/Release) as expected, and no build folders were created next to the subprojects' .xcodeproj files. Still, I don't understand why the build location settings of the subprojects, and Xcode's global setting, are not respected when you build without the -scheme option. Maybe somebody has an idea. Best wishes, Mark
Hello, We submitted our app to TestFlight, but received an automated response with the following error: ITMS-90426: Invalid Swift Support - The SwiftSupport folder is missing. Rebuild your app using the current public (GM) version of Xcode and resubmit it. Our app is developed entirely in Objective-C, and we're unsure why it's looking for SwiftSupport. Despite attempting several potential solutions, the error persists. Could someone please help us understand why this is happening and advise on how to proceed with the submission? Any guidance would be greatly appreciated. Thank you for your help.
This happens to me as well, with multiple templates in the Xcode 16.2 package.
Our app supports English and Traditional Chinese only, so the Xcode configuration and the App Store settings currently include only these two languages. However, the supported languages displayed on the App Store show that our app supports Simplified Chinese. We would like to know whether there is any configuration we missed or a setting we got wrong. We appreciate any reply or suggestion.
Since updating to an M4 Pro MBP running macOS 15.2 and now 15.3 beta, predictive text and autocompletion do not show in any apps on my computer. This all worked on my previous M3 Pro MBP. Predictive text/autocompletion is not working in any Microsoft apps either, and my settings are missing certain elements that my wife's M3 MBP shows in Outlook. Any help would be appreciated.
I just purchased a new 2025 Honda Civic Hybrid sedan with the highest trim package. The staff at the dealership set up CarPlay with my iPhone 16 Pro, and all was operating perfectly. Then, last week, I started noticing random connectivity problems with it (i.e., no sound from my audio apps, “not connected” being displayed on the dashboard display, etc.). I tried to think of what had changed with my setup, and the only change was that I updated to the latest iOS update, 18.2. I scheduled a service appointment with the Honda dealer in hopes that Apple and Honda can confirm a fix for this issue. I'll try to attach an image from my car's dashboard display as an example of an error message that isn't resolved.
I've got a web app built with MusicKit that displays a list of songs. I have player controls for play, pause, skip next, skip previous, toggle shuffle, and set repeat mode. All of these work through the music instance. The play button, when nothing is playing and nothing is in the queue, will enqueue all the tracks and start playing with the below, for example:

    await music.setQueue({ songs, startPlaying: true });

I've implemented a progress slider based on feedback from the playbackProgressDidChange listener. Now, how in the world can I set the volume? This seems like it should be simple, but I am at a complete loss here. The docs say: The volume of audio playback, which is set directly on the HTMLMediaElement as the HTMLMediaElement.volume property. This value ranges between 0, which would be muting the audio, and 1, which would be the loudest possible. Given that all my controls work off the music instance, I don't understand how I can do that. In this video from WWDC 2022, music web components are touched on
Recently, we received a rejection for our app on the App Store under Guideline 4.1 - Design - Copycats. We strongly disagree with this decision, as we are confident in the uniqueness of our project. We have submitted an appeal regarding this rejection, but so far, we have not received any confirmation that it has been reviewed or accepted for reconsideration. The explanation provided for the rejection was vague, offering only general statements and references without specific details. This makes it challenging for us to understand which aspects of our app are causing concerns. Here’s what we’ve done so far to address the feedback:
Changed the Original Name: We renamed our app to emphasize its uniqueness and avoid any potential associations with other franchises.
Removed the Character from Screenshots: We deleted the image from our screenshots to prevent any misunderstandings and to highlight the originality of our visual content.
Replaced the Icon: We completely redesigned the app icon, creating a new and uni
Everything works fine, except when tapping the navigation Back link and returning to the previous view, the AR session inside RealityView does not terminate. The green dot camera indicator stays on, it is still scanning the environment, and if the package has audio in it, the audio will still play, albeit extremely panned to the right channel. I have no issues terminating QuickLook or ARSCNView. I have a simple NavigationLink opening the RealityView...

    NavigationLink(destination: MyRealityView()) { Text("Open AR") }

    struct MyRealityView: View {
        var body: some View {
            RealityView { content in
                // Create horizontal plane anchor for the content
                let anchor = AnchorEntity(.plane(.horizontal, classification: .floor, minimumBounds: [0.5, 0.5]))
                let scene = await loadEntity(named: "Scene")
                // Add model to anchor
                anchor.addChild(scene!)
                content.add(anchor)
                // View Settings
                content.camera = .spatialTracking
            } placeholder: {
                ProgressView()
            }
            .onDisappear {
                //print("RealityView is disappearing. Cleanup actions here.")
            }
I am developing an app with support for In-App Purchases (IAP) for consumable products using StoreKit. I have defined the products in ProductList.plist and Product.storekit, but I am unable to connect them correctly to App Store Connect. Here are the details:
Products defined in ProductList.plist:
Boleto Evento Vip: com.cover.boleto.vip
Boleto Evento Básico: com.cover.boleto.basico
Configuration in Product.storekit: The products have prices and basic configurations, but they do not seem to link properly in App Store Connect.
Steps I have taken:
Configured IAP simulation in Xcode.
Attempted to register the products in App Store Connect.
Issues I am facing:
The products are not appearing in App Store Connect after configuration.
My app cannot seem to fetch consumable products from App Store Connect.
Question: What steps should I follow to correctly register consumable products in App Store Connect and connect the app with StoreKit for production?
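For reference, this is the minimal StoreKit 2 fetch I would expect to use once the products exist. It is only a sketch: the helper name is made up, the product IDs are the ones listed above, and it assumes the IAPs have been created in App Store Connect (or that the scheme's StoreKit configuration points at Product.storekit during local testing).

    import StoreKit

    // Hypothetical helper for illustration only. Product.products(for:) returns
    // only identifiers the store (or the local .storekit file) actually knows
    // about, so an empty result usually means the IAPs are not yet set up in
    // App Store Connect.
    func loadBoletoProducts() async throws -> [Product] {
        let identifiers = ["com.cover.boleto.vip", "com.cover.boleto.basico"]
        let products = try await Product.products(for: identifiers)
        let missing = Set(identifiers).subtracting(products.map(\.id))
        if !missing.isEmpty {
            print("Unknown product IDs (check App Store Connect / .storekit): \(missing)")
        }
        return products
    }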
It was working OK, then after my latest attempt at submitting a .PKG it crashes right after displaying the list of previously submitted items. This makes it impossible to make progress with my project, unless there's another way of getting it to TestFlight (not using Xcode). Transporter 1.3.2 on an Intel Mac running Sequoia 15.0.
Hello! A few months ago I got hacked on my PC, and then on my Android and iPhone. I got a notice that payments could not be drawn; luckily I only had 240 kr on my Lunar card, and 200 kr of it was drawn to a gift card. I got mail from Skrill saying an account had been created with one of my Gmail addresses. I tried to log them out, but the window kept closing. Gmail flagged like crazy and wanted me to change my password. How the **** could I, when I had lost control of my phone?!? Just lock it. Let's make it short: I shared the network to my PC from my phone with USB. I don't think it was just an attacker program, since Gmail flagged it. I think I got mirror-linked on my Android and maybe my iPhone. I had a real struggle resetting my PC and phones before it worked. My iPhone drains battery like crazy and feels laggy sometimes. An unregistered number was added to two Gmail accounts, and they tried to change the password multiple times. I noticed Linux PC activity on my Facebook and some other stuff. My iPhone still reboots sometimes, and every second reboot the wifi/bluet c
I wanted to update this post with resources I found. It appears the automation for persistent anchors and world data maps has been configured as WorldAnchors. Currently, it looks like this is only supported in visionOS. https://developer.apple.com/documentation/visionos/tracking-points-in-world-space It appears that by simply adding a WorldAnchor, visionOS automatically tracks the world map, unloading and loading it based on your location automatically in the background. This is amazing. Though I'm not sure why this wouldn't be supported on iOS and iPadOS as well. Perhaps in the future it will be implemented as a core ARKit feature too. To the best of my limited knowledge, it appears we will have to continue to use the previous methods for persistent data on those platforms, which can be found here: https://developer.apple.com/documentation/arkit/arkit_in_ios/data_management/saving_and_loading_world_data However, I still have to try to implement this with RealityView, as it is my understanding that only RealityView suppo
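For anyone finding this later, this is roughly what the WorldAnchor flow looks like on visionOS as I understand it from the documentation linked above. A sketch only, not tested code: the function name and the transform value are just placeholders.

    import ARKit
    import simd

    // Rough sketch of the visionOS WorldAnchor flow (requires an open ImmersiveSpace).
    func placePersistedAnchor() async throws {
        let session = ARKitSession()
        let worldTracking = WorldTrackingProvider()
        try await session.run([worldTracking])

        // Anchor one meter in front of the world origin; visionOS persists
        // WorldAnchors across launches and relocalizes them in the background.
        var transform = matrix_identity_float4x4
        transform.columns.3.z = -1.0
        let anchor = WorldAnchor(originFromAnchorTransform: transform)
        try await worldTracking.addAnchor(anchor)

        // Anchor updates arrive here whenever the system (re)locates them.
        for await update in worldTracking.anchorUpdates {
            print("WorldAnchor \(update.anchor.id) event: \(update.event)")
        }
    }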
I recently detected an unusual crash on 18.0, 18.1, 18.1.1, 18.2, and 18.3 which cannot be reproduced, and the page logs are related to the keyboard. Is there any idea how to deal with this problem?

Exception Category: nsexception
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x00000000 at 0x0000000000000000
Crashed Thread: 0
CrashDoctor Diagnosis: Application threw exception NSInternalInconsistencyException: Multi layer delegate table missing.
Thread 0 Crashed:
0 CoreFoundation 0x00000001869d87cc __exceptionPreprocess + [ : 164]
1 libobjc.A.dylib 0x0000000183cab2e4 objc_exception_throw + [ : 88]
2 Foundation 0x0000000185da88d8 _userInfoForFileAndLine
3 UIKitCore 0x0000000189e78074 -[UIView _multiLayerDelegatesTableCreateIfNecessary:] + [ : 208]
4 UIKitCore 0x0000000189e780c4 -[UIView _registerMultiLayerDelegate:] + [ : 36]
5 UIKitCore 0x00000001894874c0 -[_UIPortalView setSourceView:] + [ : 132]
6 UIKitCore 0x000000018a1eb6bc -[_UIPortalView initWithSourceView:] + [ : 68]
7 UIKitCore 0x000000018a213ea4 -