Search results for

“Visual Studio Maui IOS”

109,084 results found

Post · Replies · Boosts · Views · Activity

Reply to iOS 18 App Intents while supporting iOS 17
Hi @wingover, Thanks for the suggestion! Can you elaborate more on what my AppShortcutsBuilder should look like after I up the target to 17.4? For example, the following code crashes on iOS 17.5 with a 17.5 or 17.4 target:

struct SnippetsShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(intent: CreateAppIntent(),
                    phrases: [
                        "Create a \(\.$selection) in \(.applicationName) Studio",
                        "Create a \(\.$selection) in \(.applicationName)",
                        "Create a \(\.$selection)"
                    ],
                    shortTitle: "Create",
                    systemImageName: "plus")
        if #available(iOS 18, *) {
            AppShortcut(intent: SearchAppIntent(),
                        phrases: [
                            "Search \(.applicationName) Studio",
                            "Search \(.applicationName)"
                        ],
                        shortTitle: "Search",
                        systemImageName: "magnifyingglass")
        }
    }
    static let shortcutTileColor: ShortcutTileColor = .blue
}

Here is the crash message:

dyld[23602]: Symbol not found: _$s10AppIntents15AssistantSchemaV06IntentD0VAC0E0AAWP
Referenced from: /Users/jonathan/Library/Developer/CoreSimulator/Devices/2E
Topic: Machine Learning & AI SubTopic: General Tags:
Jul ’24
Reply to Reducing lag by configuring RoomPlan to scan only windows and doors
Hey @ChrisE96, I am assuming that you are building this on iOS, is that correct? I ask as you chose the Spatial Computing topic which is focused on Apple Vision Pro development. Assuming you're building on iOS, have you considered using ARKit instead of RoomPlan to detect planes? The sample code Tracking and Visualizing Planes is a great place to start. You can use ARPlaneAnchor.Classification to characterize each surface. Let me know if that helps, Michael
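As a minimal sketch of the suggestion above (assuming an iOS app with a running ARSession; the PlaneFilter class, its wiring, and the print statements are hypothetical illustrations, while ARWorldTrackingConfiguration and ARPlaneAnchor.classification are the real ARKit API):

```swift
import ARKit

// Hypothetical sketch: detect planes with ARKit and react only to windows and doors.
final class PlaneFilter: NSObject, ARSessionDelegate {
    func start(on session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        // Doors and windows are vertical surfaces, but enable both axes while scanning.
        config.planeDetection = [.horizontal, .vertical]
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            switch plane.classification {
            case .window, .door:
                // Only windows and doors reach this branch.
                print("Found \(plane.classification)")
            default:
                break // ignore walls, floors, ceilings, seats, tables, unclassified
            }
        }
    }
}
```

Note that plane classification is only available on devices that support it, so a real app should fall back gracefully when classifications come back as none.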
Topic: Spatial Computing SubTopic: ARKit Tags:
Aug ’24
Reply to Beta 3 - Visual Voicemail
I just checked with O2 how to switch it on: to divert to Visual Voicemail, dial 1750 (free from your iPhone); to switch it off, dial 1760. Once I did this, I called my number while in Airplane Mode, left a voicemail, then turned Airplane Mode back off, waited a few seconds, and Visual Voicemail had returned. So it looks like updating to Beta 3 switches Visual Voicemail off at the network level; try this with your carrier to see if it fixes your issue.
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’17
Mac app Audio Recording with Visualizations best approach
Hey, I wonder if someone can point me in the right direction. I am building a Mac audio app that needs to perform the following actions:

- Select an audio input device
- Show a live audio graph of the device input (think GarageBand/Logic)
- Capture the audio for replay
- Output the sound, or make itself selectable as an input device for another app, similar to how plugins work for GarageBand, Logic, etc.

I have looked into using the following so far: AudioKit, AVFoundation/AVCaptureDevice.

AudioKit: this framework looks amazing and has sample apps for most of the things I want to do. However, it seems that it will only accept the audio input that the Mac has selected in System Settings. This is a non-starter for me, as I want the user to be able to choose in-app (like GarageBand or Neural DSP plugins do). Using AudioEngine I can get the available input devices, but everything I have found suggests they are not changeable in-app. Here's code to display them:

struct InputDevicePicker: View {
    @State var device: Device
    var engine: AudioE
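One possible direction for the in-app device selection being asked about, sketched under the assumption of plain AVAudioEngine on macOS (the helper function name is hypothetical; the Core Audio property and AudioUnitSetProperty call are real API):

```swift
import AVFoundation
import AudioToolbox

// Hypothetical sketch (macOS): point AVAudioEngine's input at a specific
// device by setting kAudioOutputUnitProperty_CurrentDevice on the input
// node's underlying audio unit. The AudioDeviceID would come from a
// Core Audio query such as kAudioHardwarePropertyDevices.
func select(inputDevice deviceID: AudioDeviceID, for engine: AVAudioEngine) -> OSStatus {
    guard let unit = engine.inputNode.audioUnit else {
        return kAudioUnitErr_Uninitialized
    }
    var device = deviceID
    return AudioUnitSetProperty(unit,
                                kAudioOutputUnitProperty_CurrentDevice,
                                kAudioUnitScope_Global,
                                0, // element 0 (global scope)
                                &device,
                                UInt32(MemoryLayout<AudioDeviceID>.size))
}
```

Call this before starting the engine; whether AudioKit's wrappers tolerate the device being swapped underneath them is an open question.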
Replies: 0 · Boosts: 0 · Views: 627 · Jan ’23
Smart Adaptive Volume & Brightness - Say Goodbye to Noise & Visual Pollution!
Hello everyone in the iOS developer community! I'd like to share a suggestion that I believe would bring an unprecedented level of intelligence and comfort to the daily iPhone experience: Smart Adaptive Volume & Brightness.

The problem we aim to solve: how many times has your iPhone rung too loudly in a quiet environment, embarrassing you in a meeting or waking someone up? Or the opposite: you missed an important call on a busy street because the volume was too low? And what about screen brightness? It's a constant adjustment: too bright in the dark, hard to see in the sun. Currently, we have to manually adjust volume and brightness, or rely on Auto-Brightness (which only works for the screen) and Focus modes, which can be a bit all-or-nothing. This leads to interruptions, frustration, and the feeling that your phone isn't really adapting to you.

The solution: Smart Adaptive Volume & Brightness. My proposal is for iOS to use the iPhone's own sensors to dynamically adapt notification
Replies: 1 · Boosts: 0 · Views: 96 · Jun ’25
Problem in logo display of Apple Pay Later visual merchandising widget only on Safari browser
While implementing the Apple Pay Later visual merchandising widget on my website, it displays in every browser except Safari when the debug attribute is not set. In Safari it only works when the debug attribute is set to true; when we applied custom CSS setting the display property to inline-block, it still computes to none, but only in Safari. Is there any approach other than debug=true for Safari, as we cannot use the debug attribute in production?
Replies: 2 · Boosts: 0 · Views: 735 · Jul ’23
Mac Studio Ventura panic / boot loop with kext
So we have produced kexts that run well, on Intel and arm64, on (for example) an MBP/M1 (all macOS versions currently available) and a Mac Studio (Monterey). But on Monterey + Ventura it enters a boot panic loop. One example is:

build: macOS 13.1 (22C5033e)
product: Mac13,2
socId: 0x00006002
kernel: Darwin Kernel Version 22.2.0: Sun Oct 16 18:09:52 PDT 2022; root:xnu-8792.60.32.0.1~11/RELEASE_ARM64_T6000
incident: 8D3814E3-DCBB-42A6-AACF-C37F66D6BBC8
crashReporterKey: FF922DC9-99E1-68B9-75FB-9427F2BBF431
date: 2022-10-28 00:12:53.22 +0100
panicString: panic(cpu 6 caller 0xfffffe001e4b11e8): apciec[pcic2-bridge]::handleInterrupt: Request address is greater than 32 bits linksts=0x99000001 pcielint=0x02220060 linkcdmsts=0x00000000 (ltssm 0x11=L0)
 @AppleT8103PCIeCPort.cpp:1301
Debugger message: panic
Memory ID: 0x6
OS release type: User
OS version: 22C5033e
Kernel version: Darwin Kernel Version 22.2.0: Sun Oct 16 18:09:52 PDT 2022; root:xnu-8792.60.32.0.1~11/RELEASE_ARM64_T6000
Fileset Kernelcache UUID: D76
Replies: 1 · Boosts: 0 · Views: 1.2k · Oct ’22
Reply to Help Positioning Objects around another Object
The cheatiest way to do this would be to use physics. Make spheres with an associated physics body of exactly the same size as themselves, and attempt to place them. If they intersect, take a look at where they're intersecting and make an effort to determine what size fits in the available space. But I don't really understand the "auto place the objects around the sphere" line, as this seems to be the criterion for where to place them, yet it isn't explicit enough for me to figure out what you're trying to do in terms of placement. I do a LOT of 3D work in 3D apps (design apps), so I have many ways of thinking about how to fit things into 3D space and how to describe doing that, much of which is far hackier and dodgier than the rigour programmers use 😉 However... it mostly works, most often. SceneKit is basically a very primitive 3D app, kind of like using 3D Studio 2 from the early 1990s. But programmable. Which is where I get lost 😀 Just let me know how/what you're trying to do, and I'll suggest a
Topic: Graphics & Games SubTopic: SceneKit Tags:
Oct ’15
Reply to RealityKit's ARView raycast returns nothing
Use Debug Visualizations in Xcode to confirm the collision shapes are correct. Click the small Debug Visualizations icon (to the right of the step out icon in the debug tools) and check the collision shapes option. If the shapes look incorrect, try to generate the shapes using generateStaticMesh(from:) instead of generateCollisionShapes.
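A minimal sketch of that suggestion, assuming `modelEntity` is an existing ModelEntity and that ShapeResource.generateStaticMesh(from:) has an async throwing signature (the exact signature and the helper function below are assumptions, not confirmed API details):

```swift
import RealityKit

// Hypothetical sketch: replace the approximate shapes produced by
// generateCollisionShapes with a precise static mesh shape generated
// from the entity's own mesh.
func applyStaticMeshCollision(to modelEntity: ModelEntity) async {
    guard let mesh = modelEntity.model?.mesh else { return }
    do {
        // Assumed async throws signature for generateStaticMesh(from:).
        let shape = try await ShapeResource.generateStaticMesh(from: mesh)
        modelEntity.collision = CollisionComponent(shapes: [shape])
    } catch {
        print("Static mesh shape generation failed: \(error)")
    }
}
```

Keep in mind static mesh collision is for non-moving entities; dynamic bodies still need convex shapes.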
Topic: Spatial Computing SubTopic: ARKit Tags:
Jun ’24
Reply to How can I restore an app previously removed from sale?
You can contact Apple's TestFlight support to get help. — WindowsMEMZ @ Darock Studio let myEmail = memz + 1 + @ + darock.top
Oct ’24
Reply to Using a UIButton to enable or disable other UIButtons
Sorry was supposed to have a .png file for visuals
Nov ’18
Reply to Airdrop Files Not Showing Up in Downloads Folder
Same issue for about 3 months now, even after several updates. M2 MacBook Air works; M1 Max Studio doesn't...
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’23
Reply to Xcode Randomly Displays String Catalog as XML
Hold Control(^) then click the String Catalog file in the Project Navigator. Choose Open As... -> String Catalog. — WindowsMEMZ @ Darock Studio
Oct ’24
Reply to Android Studio and Github in M1 processor
Not until they come out with a new Android Studio; the current Android Studio does not work on the M1. You may have to wait a bit.
Topic: App & System Services SubTopic: Hardware Tags:
Jan ’21
Reply to Chrome Remote Desktop Permission Broken
Same here, on 14.4 beta 3; it pops up after every system start. Mac Studio M2 Ultra.
Topic: App & System Services SubTopic: Core OS Tags:
Feb ’24
Reply to Help Positioning Objects around another Object
the cheatiest way to do this would be to use physics.Make spheres with an associated physics body of exactly the same size as themselves, and attempt to place them. If they intersect, take a look at where they're intersecting and make an effort to determine what size fits in the available space.But I don't really understand the auto place the objects around the sphere line, as this seems to be the criteria process for where to place them, but not explicit enough for me to figure out what you're trying to do in terms of placement.I do a LOT of 3D work in 3D apps (design apps) so I have a lot of ways of thinking about how to fit things into 3D space, and describe how to do that, much of which is far hackier and dodgier than the riguour programmers use 😉However... it mostly works, most often.Scene Kit is basically a very primitive 3D app, kind of like using 3D Studio 2, from the early 1990's.But programmable. Which is where I get lost 😀Just let me know how/what you're trying to do, and I'll suggest a
Topic: Graphics & Games SubTopic: SceneKit Tags:
Replies
Boosts
Views
Activity
Oct ’15
Reply to RealityKit's ARView raycast returns nothing
Use Debug Visualizations in Xcode to confirm the collision shapes are correct. Click the small Debug Visualizations icon (to the right of the step out icon in the debug tools) and check the collision shapes option. If the shapes look incorrect, try to generate the shapes using generateStaticMesh(from:) instead of generateCollisionShapes.
Topic: Spatial Computing SubTopic: ARKit Tags:
Replies
Boosts
Views
Activity
Jun ’24