It can read that the model is an S5, but it can't read the watchOS version or the storage capacity.
Does anyone know how to fix this?
Core OS
Explore the core architecture of the operating system, including the kernel, memory management, and process scheduling.
Is there a maximum distance at which an entity will register a TapGesture()? I'm unable to interact with entities farther than 8 or 9 meters away. The code below generates a series of entities progressively farther away. After about 8 meters, the entities no longer respond to tap gestures.
import SwiftUI
import RealityKit
import RealityKitContent
import AudioToolbox

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            for i in 0..<10 {
                if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                    content.add(immersiveContentEntity)
                    immersiveContentEntity.position = SIMD3<Float>(x: Float(-i * i), y: 0.75, z: Float(-1 * i) - 3)
                }
            }
        }
        .gesture(tap)
    }

    var tap: some Gesture {
        TapGesture()
            .targetedToAnyEntity()
            .onEnded { value in
                AudioServicesPlaySystemSound(1057)
                print(value.entity.name)
            }
    }
}
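One thing worth ruling out (not stated in the post above, just a general RealityKit requirement): on visionOS an entity only receives SwiftUI gestures when it carries both an InputTargetComponent and a CollisionComponent, and the tap is resolved against the collision shape. A minimal sketch, with an assumed placeholder shape and size:

```swift
import RealityKit

// Hypothetical helper, not part of the original post: gives an entity the
// two components RealityKit needs before targetedToAnyEntity() can hit it.
func makeTappable(_ entity: Entity) {
    entity.components.set(InputTargetComponent())
    entity.components.set(CollisionComponent(
        shapes: [.generateBox(size: [0.5, 0.5, 0.5])]  // placeholder shape/size
    ))
}
```

If distant entities still stop responding with collision shapes in place, checking whether the cutoff moves when the shape size changes would help distinguish a gesture-distance limit from a collision-resolution issue.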
Issue:
Our app is currently experiencing an unexpected behavior related to VPN functionality on iOS devices. Despite having the "OnDemandUserOverrideDisabled" parameter set to 1 in our VPN profile, users have reported that they can create a shortcut to disable the "Connect On Demand" feature. However, upon doing so, toggling off the VPN does not re-enable the feature as anticipated. This oversight results in unfiltered browsing, potentially compromising user security and privacy.
Explanation:
The presence of "OnDemandUserOverrideDisabled" set to 1 in our VPN profile should theoretically prevent users from toggling the "Connect On Demand" feature via any means. However, users have found a workaround using shortcuts to bypass this safeguard. Consequently, the VPN does not automatically re-engage after being disabled, leading to unintended consequences for users.
Impact:
The inability to reliably control VPN settings, despite profile configurations, poses a significant risk to user data privacy and security. Unintended unfiltered browsing can expose users to malicious actors and compromise sensitive information.
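For reference, a minimal sketch of how the relevant keys sit in a VPN payload. The key names OnDemandEnabled, OnDemandUserOverrideDisabled, and OnDemandRules come from Apple's configuration-profile format; the surrounding values here are placeholders, not the profile from the report above:

```xml
<!-- Fragment of a VPN configuration profile payload (placeholder values) -->
<key>VPN</key>
<dict>
    <key>OnDemandEnabled</key>
    <integer>1</integer>
    <key>OnDemandUserOverrideDisabled</key>
    <integer>1</integer>
    <key>OnDemandRules</key>
    <array>
        <dict>
            <key>Action</key>
            <string>Connect</string>
        </dict>
    </array>
</dict>
```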
I'm currently working with complications using WidgetKit for watchOS.
When I select a complication from the Watch app on the iPhone, the complication does not show content, and in the complication gallery an untitled complication is selected. But when I select the complication from the watch, it works fine.
This bug occurs on both real devices and the simulator, but only with certain OS pairings.
Example (NG = broken, OK = working):
watch ultra (os 10.4) pair with iPhone 14 pro (os 17.0): NG
watch ultra (os 10.4) pair with iPhone 14 pro (os 16.1): NG
watch ultra (os 10.0) pair with iPhone 14 pro (os 17.0): OK
I tried creating a simple project to reproduce this bug, but it still occurs.
This is the sample project: Github
Good day. I'm inquiring whether there is a way to test functionality between Apple Pencil Pro and Apple Vision Pro. I'm trying to work on an idea that would require a tool like the Pencil as an input device. Will there be an SDK for this kind of connectivity?
Steps to reproduce:
Connect iPhone to a Bluetooth keyboard, and enable "Full Keyboard Access" in settings
Go to https://material.angular.io/components/select/examples
Open any dropdown and use the keyboard to tab away
Actual behavior: focus moves to the next control, but the dropdown panel is still open
Expected Behavior:
Dropdown should be collapsed when user tabs away using keyboard
Hi,
I was working with URLSession.upload with a background configuration.
I came across this cancellation reason in URLError.BackgroundTaskCancelledReason:
backgroundUpdatesDisabled
The docs suggest this is triggered while background tasks are disabled. Does that mean disabled by the user?
Can anyone please shed light on how this cancellation reason can occur? Who can disable the background upload (the user or the system), and how?
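For reference, a sketch of where that reason surfaces at runtime: it rides along in the URLError delivered to the background session's task delegate. The handling below is illustrative, not a complete delegate:

```swift
import Foundation

// Illustrative delegate callback on the background session's task delegate.
func urlSession(_ session: URLSession, task: URLSessionTask,
                didCompleteWithError error: Error?) {
    guard let urlError = error as? URLError,
          let reason = urlError.backgroundTaskCancelledReason else { return }

    switch reason {
    case .backgroundUpdatesDisabled:
        // The docs only say "background updates are disabled"; in practice
        // this is commonly associated with Background App Refresh being
        // unavailable to the app (user setting, restrictions, or Low Power
        // Mode) -- treat that mapping as an assumption to verify.
        print("background updates disabled")
    case .insufficientSystemResources:
        print("insufficient system resources")
    case .userForceQuitApplication:
        print("user force-quit the app")
    @unknown default:
        print("other reason")
    }
}
```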
We develop alerting Bluetooth devices for special users. We have an alerting system that helps send alerts when important things happen, e.g. phone calls and SMS. We would also like to be able to relay emergency and government alerts so that users can be warned. This means we want emergency and government alerts to be transmitted to our Bluetooth devices paired with an iPhone; can this be implemented?
Hi,
I have two iOS apps in the same app group. Now App 1 wants to execute a method in App 2 via IPC:
App 1 -> call method test() via IPC on App 2 -> App 2 returns the result to App 1.
Is this possible?
Regards,
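iOS has no sanctioned app-to-app XPC, so a direct test() call between two third-party apps isn't possible. A common substitute inside an app group is shared storage plus a Darwin notification, with the caveat that the other app must already be running to observe it. A sketch of the requesting side (the group ID and names are placeholders):

```swift
import Foundation

// Sketch: App 1 writes a "request" into the shared app-group container and
// posts a Darwin notification. App 2, if running, would observe the same
// notification via CFNotificationCenterAddObserver, compute the result,
// write it back to the shared container, and post a reply notification.
// Note this is not real method invocation; it is asynchronous signaling.
let shared = UserDefaults(suiteName: "group.example.shared")!  // placeholder

func requestTest() {
    shared.set("test", forKey: "pendingRequest")
    CFNotificationCenterPostNotification(
        CFNotificationCenterGetDarwinNotifyCenter(),
        CFNotificationName("com.example.request" as CFString),  // placeholder
        nil, nil, true)
}
```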
Hey,
I have an app that presents a UIWindow on top of an existing window. I've made all the needed calls to makeKeyAndVisible() and I can clearly see the new window with the subviews on the screen.
I've made the subviews focusable by implementing them as custom views that return true from canBecomeFocused, and the window implements preferredFocusEnvironments and returns these subviews.
The issue is that the focus engine doesn't want to move to the bottom view; it seems only the first one gets focus.
The error I'm seeing when printing using UIFocusDebugger.checkFocusability():
<UIFocusUpdateContext: 0x600003318000: previouslyFocusedItem=<FV: 0x102a17f30>, nextFocusedItem=(null), focusHeading=Down>
The following issues were found that would prevent this item from being focusable:
- ISSUE: This item is not visible onscreen.
I can't find any reference online about this error, so it seems something really wild is going on and I can't find the reason.
Has anyone encountered this before, or does anyone know when this error is printed?
What causes the focus engine to think the view is not visible when it's all there?
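For comparison, a sketch of the setup described above, with a note on the usual triggers of that message. The focus engine reports "not visible onscreen" for views it considers invisible at focus-update time: a zero-size frame, alpha of 0, isHidden set, or an ancestor (including the window) that is off-screen or not yet laid out. The class names here are placeholders:

```swift
import UIKit

// Sketch of a focusable custom view plus a window that steers focus.
final class FocusableView: UIView {
    override var canBecomeFocused: Bool { true }
}

final class OverlayWindow: UIWindow {
    var focusTargets: [UIView] = []

    override var preferredFocusEnvironments: [UIFocusEnvironment] {
        focusTargets
    }
}
```

One thing worth trying under these assumptions: request the focus update only after the window has completed layout (e.g. call setNeedsFocusUpdate() and updateFocusIfNeeded() from a point after layoutSubviews has run), so the target views have nonzero frames when the engine evaluates them.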
Hello,
I have an iOS app that records audio and works fine on iPads/iPhones. It asks for microphone permission, and after that recording works.
I installed the same app on my M3 MacBook via TestFlight, since iPad apps are supposed to work there without changes. The app starts fine, but it never asks for microphone permission, so I can't record.
Do I need to do something to make this happen? (This is not Mac Catalyst; it's the arm64 iPhone binary running on macOS.)
Thanks
In Xcode Version 15.3 (15E204a) I am getting this warning:
Class AKAlertImageURLProvider is implemented in both /Library/Developer/CoreSimulator/Volumes/iOS_21E213/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS17.4.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/AuthKit.framework/AuthKit (0x123719508)
/Library/Developer/CoreSimulator/Volumes/iOS_21E213/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS17.4.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/AuthKitUI.framework/AuthKitUI
One of the two will be used. Which one is undefined.
and also
objc[2875]: Class AKBiometricRatchetUtility is implemented in both /Library/Developer/CoreSimulator/Volumes/iOS_21E213/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 17.4.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/AuthKit.framework/AuthKit (0x12371ab10) and /Library/Developer/CoreSimulator/Volumes/iOS_21E213/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 17.4.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/AuthKitUI.framework/AuthKitUI (0x152a1d810). One of the two will be used. Which one is undefined.
I'm currently trying to develop an L2CAP demo application where I want to send data to a Bluetooth chip over L2CAP.
I'm able to send several packets from the iPhone to the Bluetooth chip. The chip sends back credits and the iPhone continues to send data.
After a while, the output stream suddenly crashes and raises an error that I'm not able to interpret.
Here is the error raised:
/Users/13fmeyer/Documents/Screenshot 2024-05-07 at 15.15.37.png
Here is the section of the code that manage the write :
/Users/13fmeyer/Documents/Screenshot 2024-05-07 at 15.16.49.png
and the write method :
/Users/13fmeyer/Documents/Screenshot 2024-05-07 at 15.17.20.png
Finally it closes the connection and stops the transmission.
Does somebody have any idea about what happens?
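Since the screenshots with the exact error and code aren't visible here, one common failure mode worth checking: writing to the channel's OutputStream while it has no space available. A sketch of a buffered writer that only writes on .hasSpaceAvailable and honors the byte count returned by write(_:maxLength:); all names are placeholders:

```swift
import Foundation

// Sketch of a buffered writer for an L2CAP CBL2CAPChannel's outputStream.
final class L2CAPWriter: NSObject, StreamDelegate {
    private var pending = Data()
    private let output: OutputStream

    init(output: OutputStream) {
        self.output = output
        super.init()
        output.delegate = self
        output.schedule(in: .main, forMode: .default)
        output.open()
    }

    func send(_ data: Data) {
        pending.append(data)
        drain()
    }

    private func drain() {
        while output.hasSpaceAvailable, !pending.isEmpty {
            let written = pending.withUnsafeBytes { buf -> Int in
                output.write(buf.bindMemory(to: UInt8.self).baseAddress!,
                             maxLength: pending.count)
            }
            guard written > 0 else { break }  // 0 or -1: stop and wait/handle error
            pending.removeFirst(written)
        }
    }

    func stream(_ stream: Stream, handle event: Stream.Event) {
        if event.contains(.hasSpaceAvailable) { drain() }
    }
}
```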
I am looking for code that computes eigenvalues and eigenvectors using the Accelerate Sparse Matrix library.
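Accelerate's Sparse Solvers expose factorizations (SparseFactor/SparseSolve) and a sparse matrix-vector product (SparseMultiply), but no eigensolver, so eigenpairs are usually computed by iterating on top of those primitives. Below is a framework-free power-iteration sketch for the dominant eigenpair of a symmetric matrix; the inner dense product is the piece you would replace with SparseMultiply:

```swift
// Power iteration: repeatedly apply A and normalize; the iterate converges
// to the eigenvector of the largest-magnitude eigenvalue (symmetric A,
// simple dominant eigenvalue assumed). Purely illustrative.
func dominantEigenpair(_ a: [[Double]], iterations: Int = 500) -> (value: Double, vector: [Double]) {
    let n = a.count
    var v = [Double](repeating: 1.0, count: n)
    for _ in 0..<iterations {
        // w = A * v  (swap in SparseMultiply for a sparse A)
        var w = [Double](repeating: 0, count: n)
        for i in 0..<n { for j in 0..<n { w[i] += a[i][j] * v[j] } }
        let norm = (w.reduce(0) { $0 + $1 * $1 }).squareRoot()
        v = w.map { $0 / norm }
    }
    // Rayleigh quotient v^T A v gives the eigenvalue (v is unit length).
    var av = [Double](repeating: 0, count: n)
    for i in 0..<n { for j in 0..<n { av[i] += a[i][j] * v[j] } }
    let value = zip(v, av).reduce(0) { $0 + $1.0 * $1.1 }
    return (value, v)
}
```

For interior eigenvalues, the same loop run against a SparseFactor-based solve of (A − σI), i.e. shifted inverse iteration, converges to the eigenpair nearest the shift σ.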
import SwiftUI
import TipKit

@main
struct TipKit_WithPresentPageApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
                .task {
                    try? Tips.resetDatastore()
                    try? Tips.configure([
                        .datastoreLocation(.applicationDefault)
                    ])
                }
        }
    }
}

import SwiftUI

struct ContentView: View {
    @State private var isPresented: Bool = false

    var body: some View {
        NavigationStack {
            VStack {
                Image(systemName: "globe")
                    .imageScale(.large)
                    .foregroundStyle(.tint)
                    .popoverTip(MyTip())
                    .padding(100)
                Button("Hit Me!") {
                    isPresented.toggle()
                    // When the TipKit notification appears, the 'present sheet' button will be non-functional. (iPhone SE and simulator devices)
                }
                .padding()
                .sheet(isPresented: $isPresented) {
                    PresentPage()
                }
            }
        }
    }
}

import SwiftUI

struct PresentPage: View {
    var body: some View {
        Text("Hello, world again!")
            .font(.title)
    }
}

import TipKit

struct MyTip: Tip {
    var title: Text {
        Text("Test")
    }
    var message: Text? {
        Text("Hi")
    }
}
When the TipKit notification appears, the 'present sheet' button will be non-functional. (iPhoneSE Landscape Right)
When using the iPhone SE (Landscape Right) or its simulator (iPhone SE Landscape Right), running iOS 17.2. Whenever the TipKit notification is triggered and displayed on the screen, the 'present sheet' button, which is typically used for presenting a new sheet within the app, becomes non-functional.
Device: iPhoneSE iOS 17.2
Does anyone know how to work around this bug? Thank you.
Ever since I received the latest Beta update I have not been able to use CarPlay via a cord to my car (I've tried multiple cords with the same result). It has been working as expected since I bought the car a few years ago. I've tried multiple other Apple phones and they have no issue, so I know it's not my car's software. I have called apple support and they gave me instructions on how to uninstall the Beta software, which has also not been successful. I entered a ticket for this issue on April 22nd, but haven't heard anything back.
Does anyone have suggestions on how I can get CarPlay to work, or how to successfully uninstall the iOS beta?
So, we've got a mobile app that is using background processing to occasionally scan for nearby BLE beacons. When running a debug / local build of the app, everything behaves as expected, but if we upload a build to TestFlight and install from there, the background processing doesn't happen.
This behavior would seem to point to a capability in the App ID not being provisioned, but when I look in the Identifiers section of Apple Developer, Background Modes isn't listed as a capability for either the existing application identifier or when creating a brand new one. The app has the Background Modes capability assigned if I look at it in Xcode.
Any thoughts as to where to look next or what I'm missing?
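One clarifying note: Background Modes is not a developer-portal capability at all; it is configured per target in Xcode and lands in the app's Info.plist as UIBackgroundModes, which is why it never appears under Identifiers. A useful next step is inspecting the Info.plist inside the TestFlight-built .app to confirm the modes survived the archive. A sketch of the relevant keys, where the specific modes and identifier are assumptions based on BLE beacon scanning, not taken from the post:

```xml
<!-- Info.plist fragment (modes shown are assumptions for BLE scanning) -->
<key>UIBackgroundModes</key>
<array>
    <string>bluetooth-central</string>
    <string>processing</string>
</array>
<key>BGTaskSchedulerPermittedIdentifiers</key>
<array>
    <string>com.example.app.scan</string><!-- placeholder identifier -->
</array>
```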
How can a program be launched at startup if it is not in Login Items and not in Launch Daemons/Agents? Spotify, for example.
I created a PWA that requires access to the user's geolocation to perform a certain action in the system. The expected behavior is that the user opens the application and the operating system prompts them to allow sharing their exact location with the PWA. However, this is not happening for a few users who have an iPhone 11 or XR.
I tested it on iPhones 14, 13, 11 Pro, and even iPhone 6, and it works as expected. I directly spoke with a user who was experiencing the problem and conducted some tests.
I checked that location access was allowed in the settings.
I verified that Safari's location permission had the "always ask" option selected.
In the settings for my system's website, I checked that location access was allowed with the "always ask" option chosen.
We changed all prompting options to allow.
We opened the following site https://whatpwacando.today/ and found that geolocation was also not possible.
Everything indicates that the issue lies with these users' phones; however, other geolocation methods work fine, as other geolocation apps function properly. This leads me to think that it might be a problem with Safari not working properly with the HTML Geolocation API.
I'm not sure if there are any more advanced settings that could help or if anyone else has encountered this issue.
I have an iPad mini 4 with iPadOS 15.8.2.
I'm trying to install iPadOS 17.5 beta 4, but I can't succeed in installing it.
I read the documents related to iPadOS 17.5 beta 4 in the Apple Developer Program.
Following the instructions, I repeatedly tried to install the profile on my mini 4 in various ways, but in vain. I made inquiries to AppleCare and Apple Developer Support, and they gave me what advice they had in their resources. They told me my steps seemed correct for installing the profile related to iPadOS 17.5 beta 4 on machines running iPadOS 16.3 and earlier. They also said they basically didn't know the why and how of a beta program, and told me to post my problem in the Developer Forums.
I also have an M2 MacBook Pro, an iPad 9, and an SE3. "Beta Updates" appears in System Settings on those three devices, but only "Automatic Updates" appears on the mini 4. My Apple ID is definitely enrolled for developer betas.
What should I do in order to install the profile for iPadOS 17.5 beta 4?
Please tell me how.