Hello! After many hours of researching a solution, I've decided to post my problem here.
The app crashes right after the splash screen, and the console says:
Bootstrapping failed for <FBApplicationProcess: 0x144c762e0; application<app.id>:<invalid>> with error: <NSError: 0x2808737e0; domain: RBSRequestErrorDomain; code: 5; reason: "Launch failed."> {
underlyingError = <NSError: 0x280871740; domain: NSPOSIXErrorDomain; code: 88>;
}
I have Xcode 12.4, with "Automatically manage signing" enabled. The IPA file is exported as an App Store Connect build.
I have an identifier created, and the provisioning profile is created too.
The app works perfectly when exported as a Development build; it installs and runs on multiple iPhones.
---
Here is my ObservableObject class.
I want to pass in a category, and use it to initialize the source with a filtered list:
//UsersViewModel.swift
class UsersViewModel: ObservableObject {
    @Published var users: [User]
    @Published var category: String

    init(category: String) {
        self.category = category
        self.users = UserData().getUsers(byCategory: category)
    }
}
This is my View
//UserListByCategory.swift
import SwiftUI

struct UserListByCategory: View {
    @EnvironmentObject var ud: UsersViewModel

    var body: some View {
        Text("Hello")
    }
}

struct UserListByCategory_Previews: PreviewProvider {
    static var previews: some View {
        UserListByCategory()
            .environmentObject(UsersViewModel(category: "Office"))
    }
}
I'm hardcoding "Office" here, but is there a way to pass this value to the class?
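One possible approach (a sketch, not from the original post; `User` and `UserData` are stubbed out below as hypothetical stand-ins, and @StateObject requires iOS 14+/macOS 11+): give the view an initializer that takes the category and let the view own the model with @StateObject, instead of injecting it with environmentObject:

```swift
import SwiftUI
import Combine

// Hypothetical stand-ins for the post's User and UserData types.
struct User: Identifiable {
    let id = UUID()
    let name: String
    let category: String
}

struct UserData {
    func getUsers(byCategory category: String) -> [User] {
        [User(name: "Ann", category: category)]  // stub data
    }
}

class UsersViewModel: ObservableObject {
    @Published var users: [User]
    @Published var category: String

    init(category: String) {
        self.category = category
        self.users = UserData().getUsers(byCategory: category)
    }
}

struct UserListByCategory: View {
    // The view owns the model; the category arrives through the initializer.
    @StateObject private var ud: UsersViewModel

    init(category: String) {
        _ud = StateObject(wrappedValue: UsersViewModel(category: category))
    }

    var body: some View {
        List(ud.users) { Text($0.name) }
    }
}

// Usage: UserListByCategory(category: "Office")
```

With this shape the caller writes `UserListByCategory(category: "Office")`, and nothing is hardcoded inside the view itself.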
---
I'm using this code to create a rectangle (that will eventually be a more complex shape):
let vertices = [
    simd_float3(x:  1, y:  1, z: 0),
    simd_float3(x:  1, y: -1, z: 0),
    simd_float3(x: -1, y: -1, z: 0),
    simd_float3(x: -1, y:  1, z: 0)
]
let vertexSource = SCNGeometrySource(
    data: Data(bytes: vertices, count: MemoryLayout<simd_float3>.size * vertices.count),
    semantic: .vertex,
    vectorCount: vertices.count,
    usesFloatComponents: true,
    componentsPerVector: 3,
    bytesPerComponent: MemoryLayout<Float>.size,
    dataOffset: 0,
    dataStride: MemoryLayout<simd_float3>.stride)
let indices: [Int32] = Array(0..<Int32(vertices.count))
let element = SCNGeometryElement(
    data: Data(bytes: indices, count: MemoryLayout<Int32>.size * indices.count),
    primitiveType: .polygon,
    primitiveCount: 1,
    bytesPerIndex: MemoryLayout<Int32>.size)
let geometry = SCNGeometry(sources: [vertexSource], elements: [element])
which logs this error in the Xcode console:
[SceneKit] Error: SCNGeometryElement initialization - Invalid polygon edge count (0)
There also doesn't seem to be any documentation about how to use this .polygon mode. When using .triangleStrip with a primitiveCount of 2, no error is logged.
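For what it's worth, a hedged sketch of a layout that appears to satisfy .polygon (this layout is not officially documented, so treat it as an assumption): the element data starts with one edge count per polygon, followed by that polygon's indices. The "Invalid polygon edge count (0)" error is consistent with SceneKit reading index 0 where it expected that count.

```swift
import Foundation
import SceneKit

// For .polygon, the index data apparently begins with the number of edges
// of each polygon, then lists that polygon's vertex indices.
let polygonIndices: [Int32] = [
    4,            // the single polygon has 4 vertices
    0, 1, 2, 3    // the indices of those vertices
]
let polygonElement = SCNGeometryElement(
    data: Data(bytes: polygonIndices,
               count: MemoryLayout<Int32>.size * polygonIndices.count),
    primitiveType: .polygon,
    primitiveCount: 1,
    bytesPerIndex: MemoryLayout<Int32>.size)
```

The rest of the geometry setup (the vertex source) stays the same as in the original code.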
---
Hello all,
Our app's product is a time-based, limited-duration offering, and we want to offer either a monthly subscription for 4 months or an up-front payment for the 4 months. The up-front payment is easy to add (thinking a non-renewing subscription), but I can't find a way to automatically cancel an auto-renewing subscription after the user has paid for 4 months.
In general, I'm hoping for some way to set a schedule to end or stop the subscription, or to cancel subscriptions after the 4 months by making calls to an API or something. Alternatively, some sort of 4-month financing/pay-over-the-course-of-4-months option would work similarly.
---
I am trying to instantiate my ARView controller from the AppDelegate. The code is below:
func application(_ application: UIApplication,
                 didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    let storyboard = UIStoryboard(name: "Main", bundle: nil)
    let initialViewController = storyboard.instantiateViewController(withIdentifier: "StartViewMain") as! ViewController
    self.window?.rootViewController = initialViewController
    self.window?.makeKeyAndVisible()
    return true
}
This doesn't give me the ARView. However, if I comment out the instantiation code in the AppDelegate and make Main.storyboard's scene the initial view controller, it works fine. What could be the problem with this code? The end objective is to be able to show the ARView controller on demand. Below is the code for the ViewController.swift file. Thanks.
/* See LICENSE folder for this sample's licensing information.
Abstract: The sample app's main view controller. */
import UIKit
import RealityKit
import ARKit
import Combine
class ViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!

    // The 3D character to display.
    var character: BodyTrackedEntity?
    let characterOffset: SIMD3<Float> = [-1.0, 0, 0] // Offset the character by one meter to the left
    let characterAnchor = AnchorEntity()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        arView.session.delegate = self
        arView.debugOptions.insert(.showStatistics)

        // If the iOS device doesn't support body tracking, raise a developer error for
        // this unhandled case.
        guard ARBodyTrackingConfiguration.isSupported else {
            fatalError("This feature is only supported on devices with an A12 chip")
        }

        // Run a body tracking configuration.
        let configuration = ARBodyTrackingConfiguration()
        arView.session.run(configuration)
        arView.scene.addAnchor(characterAnchor)

        // Asynchronously load the 3D character.
        var cancellable: AnyCancellable? = nil
        cancellable = Entity.loadBodyTrackedAsync(named: "character/robot").sink(
            receiveCompletion: { completion in
                if case let .failure(error) = completion {
                    print("Error: Unable to load model: \(error.localizedDescription)")
                }
                cancellable?.cancel()
            }, receiveValue: { (character: Entity) in
                if let character = character as? BodyTrackedEntity {
                    // Scale the character to human size
                    character.scale = [1.0, 1.0, 1.0]
                    self.character = character
                    cancellable?.cancel()
                } else {
                    print("Error: Unable to load model as BodyTrackedEntity")
                }
            })
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let bodyAnchor = anchor as? ARBodyAnchor else { continue }

            // Update the position of the character anchor.
            let bodyPosition = simd_make_float3(bodyAnchor.transform.columns.3)
            characterAnchor.position = bodyPosition + characterOffset
            // Also copy over the rotation of the body anchor, because the skeleton's pose
            // in the world is relative to the body anchor's rotation.
            characterAnchor.orientation = Transform(matrix: bodyAnchor.transform).rotation

            if let character = character, character.parent == nil {
                // Attach the character to its anchor as soon as
                // 1. the body anchor was detected and
                // 2. the character was loaded.
                characterAnchor.addChild(character)
            }

            // Anchor transform in 3D
            // Access the position of the root node.
            let hipWorldPosition = bodyAnchor.transform
            // Access the skeleton geometry.
            let skeleton = bodyAnchor.skeleton
            // Access the list of transforms of all joints relative to the root.
            let jointTransforms = skeleton.jointModelTransforms
            // Iterate over all the joints.
            for (i, jointTransform) in jointTransforms.enumerated() {
                // Extract parent index from definition.
                let parentIndex = skeleton.definition.parentIndices[i]
                // Check if it's not root (the root doesn't have a parent).
                guard parentIndex != -1 else { continue }
                // Find position of parent joint
                let parentJointTransform = jointTransforms[parentIndex]
                // Use this however you want...
            }
        }
    }
}
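One likely cause, as a hedged sketch rather than a confirmed diagnosis: when no storyboard scene is marked as the initial view controller, UIKit never creates the window, so `self.window` in the app delegate is nil and the root view controller is assigned to nothing. Creating the window explicitly may fix it (on iOS 13+ with a scene-based lifecycle, the same setup belongs in the SceneDelegate's `scene(_:willConnectTo:options:)` using `UIWindow(windowScene:)` instead):

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        let storyboard = UIStoryboard(name: "Main", bundle: nil)
        let initialViewController = storyboard.instantiateViewController(withIdentifier: "StartViewMain")
        // Create the window explicitly; without a storyboard-designated
        // initial view controller, self.window is nil at this point.
        window = UIWindow(frame: UIScreen.main.bounds)
        window?.rootViewController = initialViewController
        window?.makeKeyAndVisible()
        return true
    }
}
```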
---
My app has a utility window that can live anywhere on screen, including over the menu bar, or in the top region of a screen with a notch when the main screen is not the built-in display.
If I drag the window into these areas, it sits there just fine. I have drag handlers in a subclass of NSWindow that call -setFrame: (yeah, this is Obj-C code).
If the screen gets reconfigured, my code tries to remember where the window was, and calls -setFrame: to put it back there. But in this case, macOS moves my window down out of the menu bar/notch area at the top of the screen.
Is there any way to prevent this behavior?
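One avenue that may be worth trying (a sketch under the assumption that the repositioning goes through AppKit's normal constraining path, not a confirmed fix): macOS adjusts programmatically-set frames via -constrainFrameRect:toScreen:, so an NSWindow subclass can override it to return the proposed frame unchanged. The post's code is Obj-C; this sketch is Swift, but the override is the same idea.

```swift
import AppKit

// Sketch: opt out of AppKit's automatic frame constraining so setFrame:
// can place the window over the menu bar / notch area after a screen change.
final class UnconstrainedWindow: NSWindow {
    override func constrainFrameRect(_ frameRect: NSRect, to screen: NSScreen?) -> NSRect {
        // Return the proposed frame as-is instead of letting AppKit
        // push the window down out of the menu bar region.
        return frameRect
    }
}
```

Interactive drags already work for you, so the override mainly matters for the programmatic -setFrame: call after a screen reconfiguration.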
---
Xcode 13.2
Using CNContact, I'm seeing this error in the log:
...
[framework] Resetting database at nil path is not allowed
....
What does this mean? What causes this to happen?
How do I avoid this?
Any help appreciated!
Steve
---
My Jenkins build process uses 'xcrun altool --upload-app' to upload builds for TestFlight, which worked well until I switched to Xcode 13. With Xcode 13 the app was uploaded, but instead of using the app's CFBundleVersion as the build number, it generated its own. So now the latest build has build number 7 rather than the correct build number, which should look like 4.5.12345.
I have since updated my scripts to use 'xcrun altool --upload-package' with --bundle-version to set the build number, but it will not let me upload anything, because it sees build number 7 as being higher than 4.5.12346.
I have marked the builds with incorrect build numbers as rejected but that doesn't help.
Is there any way to get around this?
Updating to a new app version is not really a solution.
---
Hi,
I'm developing a captive portal that uses cookies. It works fine on devices like Macs, laptops, and Android devices, but the cookies don't work on iPhone.
I'm creating the cookies in JavaScript with document.cookie, but when I connect my iPhone and close the CNA (Captive Network Assistant), it destroys them, so the next time I try to connect the cookies no longer exist.
Can someone help me figure out how to keep the CNA from destroying my cookies?
---
When running the same code on my M1 Mac with tensorflow-metal vs. in Google Colab, I see a discrepancy in the results.
The code: https://colab.research.google.com/drive/13GzSfToUvmmGHaROS-sGCu9mY1n_2FYf?usp=sharing
import tensorflow as tf
import numpy as np
import pandas as pd

# Setup model
input_shape = (10, 5)
model_tst = tf.keras.Sequential()
model_tst.add(tf.keras.Input(shape=input_shape))
model_tst.add(tf.keras.layers.LSTM(100, return_sequences=True))
model_tst.add(tf.keras.layers.Dense(2, activation="sigmoid"))
model_tst.summary()

optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)
loss = tf.keras.losses.BinaryCrossentropy(from_logits=False)
model_tst.compile(
    loss=loss,
    optimizer=optimizer,
    # metrics=[tf.keras.metrics.BinaryCrossentropy()]
    metrics=["mse"]
)

# Generate step data
random_input = np.ones((11, 10, 5))
random_input[:, 8:, :] = 99

# Predictions
random_output2 = model_tst.predict(random_input, batch_size=1)[0, :, :].reshape(10, 2)
random_output3 = model_tst.predict(random_input, batch_size=10)[0, :, :].reshape(10, 2)

# Compare results
diff2 = random_output3 - random_output2
pd.DataFrame(diff2).T
Output on Mac: (screenshot not included)
Output on Google Colab: (screenshot not included)
If I reduce the number of nodes in the LSTM, the problem disappears:
import tensorflow as tf
import numpy as np
import pandas as pd

# Setup model
input_shape = (10, 5)
model_tst = tf.keras.Sequential()
model_tst.add(tf.keras.Input(shape=input_shape))
model_tst.add(tf.keras.layers.LSTM(2, return_sequences=True))
model_tst.add(tf.keras.layers.Dense(2, activation="sigmoid"))
model_tst.summary()

optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)
loss = tf.keras.losses.BinaryCrossentropy(from_logits=False)
model_tst.compile(
    loss=loss,
    optimizer=optimizer,
    # metrics=[tf.keras.metrics.BinaryCrossentropy()]
    metrics=["mse"]
)

# Generate step data
random_input = np.ones((11, 10, 5))
random_input[:, 8:, :] = 99

# Predictions
random_output2 = model_tst.predict(random_input, batch_size=1)[0, :, :].reshape(10, 2)
random_output3 = model_tst.predict(random_input, batch_size=10)[0, :, :].reshape(10, 2)

# Compare results
diff2 = random_output3 - random_output2
pd.DataFrame(diff2).T
-> The outputs are the same in this case.
I guess this has to do with how the calculations are dispatched to Apple silicon.
Are there any debugging steps I should try to resolve this problem?
Info:
I set up TensorFlow using the steps here: https://developer.apple.com/metal/tensorflow-plugin/
When running, I get console output showing that the GPU plugin is being used.
---
Hi!
I'm trying to codesign a file inside a macOS executor on CircleCI. I'm able to import the certificate into my new keychain, but when I execute the codesign command it hangs; it never returns. No output, nothing. I have to kill the command.
Does anyone have an idea? Thanks!
JF
security create-keychain -p default MyKeychain.keychain
echo $APPLE_CERT_DEV_ID_APP_BASE64 | base64 -D -o DevIdApplication.p12
security import ./DevIdApplication.p12 -x -t agg -k MyKeychain.keychain -A -P "$APPLE_CERT_PASSWORD"
security default-keychain -d user -s MyKeychain.keychain
security unlock-keychain -p default MyKeychain.keychain
security set-keychain-settings MyKeychain.keychain
security find-identity -p codesigning
touch file
codesign --timestamp --options runtime -s "Developer ID Application: XXXXXXX inc. (XXXXXXXXXX)" -v file
<< the command hangs here and nothing happens >>
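In case it helps anyone hitting the same hang: a common cause on CI is codesign silently waiting on a keychain access prompt that never appears in a headless session. A hedged sketch of the usual workaround (the keychain name and password here match the script above) is to set the key partition list after importing the certificate:

```shell
# Allow Apple tools (codesign) to use the imported key without a UI prompt.
# Run after `security import` and `security unlock-keychain`.
security set-key-partition-list -S apple-tool:,apple: -s -k default MyKeychain.keychain
```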
---
I'd seen something suggesting that AVAssetWriter can write ProRes in MXF files, likely as part of MTRegisterProfessionalVideoWorkflowFormatReaders() and/or VTRegisterProfessionalVideoWorkflowVideoDecoders/Encoders. All of this is undocumented. Is it possible to use AVAssetReader/AVAssetWriter to handle MXF-wrapped ProRes (without writing MXF handlers from scratch)?
---
Without using @FetchRequest at the View level, has anyone successfully implemented a Core Data stack with observability, with a view model sitting between the View and the Core Data stack?
CoreData Stack
ViewModel (ObservableObject)
SwiftUI View
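One pattern that may fit (a hedged sketch; the `Item` entity, its `name` attribute, and the in-memory stack below are all stand-ins invented so the example is self-contained): wrap an NSFetchedResultsController inside the view model, so observability comes from @Published and the View never touches @FetchRequest.

```swift
import CoreData
import Combine

// Hypothetical entity for the sketch.
@objc(Item)
final class Item: NSManagedObject {
    @NSManaged var name: String
}

// Minimal in-memory Core Data stack built in code.
func makeContainer() -> NSPersistentContainer {
    let entity = NSEntityDescription()
    entity.name = "Item"
    entity.managedObjectClassName = "Item"
    let nameAttr = NSAttributeDescription()
    nameAttr.name = "name"
    nameAttr.attributeType = .stringAttributeType
    entity.properties = [nameAttr]

    let model = NSManagedObjectModel()
    model.entities = [entity]

    let container = NSPersistentContainer(name: "Demo", managedObjectModel: model)
    let description = NSPersistentStoreDescription()
    description.type = NSInMemoryStoreType
    container.persistentStoreDescriptions = [description]
    container.loadPersistentStores { _, error in precondition(error == nil) }
    return container
}

// The view model observes Core Data through NSFetchedResultsController;
// a SwiftUI View only ever sees the @Published array.
final class ItemListViewModel: NSObject, ObservableObject, NSFetchedResultsControllerDelegate {
    @Published private(set) var items: [Item] = []
    private let frc: NSFetchedResultsController<Item>

    init(context: NSManagedObjectContext) {
        let request = NSFetchRequest<Item>(entityName: "Item")
        request.sortDescriptors = [NSSortDescriptor(key: "name", ascending: true)]
        frc = NSFetchedResultsController(fetchRequest: request,
                                         managedObjectContext: context,
                                         sectionNameKeyPath: nil,
                                         cacheName: nil)
        super.init()
        frc.delegate = self
        try? frc.performFetch()
        items = frc.fetchedObjects ?? []
    }

    func controllerDidChangeContent(_ controller: NSFetchedResultsController<NSFetchRequestResult>) {
        items = frc.fetchedObjects ?? []
    }
}
```

A View can then hold the model (e.g. via @StateObject or @ObservedObject) and render `items` directly; the FRC delegate keeps the array current as the store changes.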
---
Hi there, I wonder what some common reasons are for an Order ID to be invalid... (Other than fake, randomly generated Order IDs, of course.)
I have a customer sending me a receipt of a non-consumable item, but when I tried to look up the order using https://api.storekit.itunes.apple.com/inApps/v1/lookup/{order-id}, the response code is 1, indicating the Order ID is invalid.
I was able to use the same method to fetch one of my own purchases (because what indie app dev doesn't buy their own stuff?) so I believe the method to look up by Order ID is tried and true.
It doesn't seem like the user was faking a screenshot of the order. On the receipt screenshot, the customer purchased the item using Store Credits. Is it likely that the transaction was later marked as fraudulent by Apple, and therefore invalid?
Just seeing if anyone has info on this.
Thanks!