Posts

Post marked as solved
2 Replies
1.4k Views
I'm using MTKTextureLoader to load a texture. It works on iPads, but not on iPhones, where I get the error:

```
Error Domain=MTKTextureLoaderErrorDomain Code=0 "Image decoding failed" UserInfo={NSLocalizedDescription=Image decoding failed, MTKTextureLoaderErrorKey=Image decoding failed}
```

I'd like to know why, how to detect it, and how to work around it.

I'm working with devices on iOS 11 and 12, but the OS version doesn't seem to make a difference, just iPad vs. iPhone. Setup to hand right now:

iPhone X: iOS 12.1.3, A11 GPU
iPad: iOS 11.4.1, A9 GPU

The texture is JPG format, size 8192 x 8192, and when it loads it has the pixel format MTLPixelFormatBGRA8Unorm_sRGB. The code to load it is:

```objc
NSError *textureError;
//self.texture = [textureLoader newTextureWithCGImage:image.CGImage options:textureLoadingOptions error:&textureError];
self.texture = [textureLoader newTextureWithContentsOfURL:textureURL options:textureLoadingOptions error:&textureError];
if (textureError) {
    NSLog(@"Texture failed to load %@ with error : %@", image, textureError);
    return NO;
}
```

The commented-out code was loading the texture from a UIImage; the result is the same.

Thinking it was the feature set, I checked which iOS 11 feature sets the devices support: the iPhone supports all of them, and the iPad supports all but MTLFeatureSet_iOS_GPUFamily4_v1. I was expecting the iPhone to be the one missing something.

I saw this question which has the same error: https://forums.developer.apple.com/thread/97218

I haven't tried that solution yet, but it's my next step. It won't be a very useful solution, though, if I can't detect when I have to use it.
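Not from the original post — a hedged sketch of one workaround along the lines of the linked thread, assuming the failure is in the loader's own decode path rather than in Metal: decode the image into a plain bitmap with Core Graphics first, then upload the raw pixels to a manually created MTLTexture. The function name is illustrative.

```swift
import CoreGraphics
import Metal
import MetalKit

// Sketch: decode a CGImage to BGRA bytes ourselves, then upload the
// bytes with replace(region:...) instead of going through MTKTextureLoader.
func makeTexture(from image: CGImage, device: MTLDevice) -> MTLTexture? {
    let width = image.width, height = image.height
    let bytesPerRow = width * 4
    var pixels = [UInt8](repeating: 0, count: bytesPerRow * height)

    // Draw into a BGRA little-endian sRGB bitmap context.
    let decoded: Bool = pixels.withUnsafeMutableBytes { buffer in
        guard let ctx = CGContext(data: buffer.baseAddress,
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: bytesPerRow,
                                  space: CGColorSpace(name: CGColorSpace.sRGB)!,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                      | CGBitmapInfo.byteOrder32Little.rawValue)
        else { return false }
        ctx.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
        return true
    }
    guard decoded else { return nil }

    // Matches the pixel format the loader was producing.
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm_srgb,
                                                        width: width, height: height,
                                                        mipmapped: false)
    guard let texture = device.makeTexture(descriptor: desc) else { return nil }
    texture.replace(region: MTLRegionMake2D(0, 0, width, height),
                    mipmapLevel: 0,
                    withBytes: pixels,
                    bytesPerRow: bytesPerRow)
    return texture
}
```

A detection strategy could then be simply to attempt the MTKTextureLoader path first and fall back to this on an MTKTextureLoaderErrorDomain error.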
Posted Last updated
Post not yet marked as solved
2 Replies
1.3k Views
I've been trying to render some point clouds for the last few days. I've gone through various examples, and I'm pretty sure I have things right, but nothing seems to work as of iOS 11, I think; at least, all the examples I can find are around two years old with nothing after that. I've become convinced it's just broken, since GitHub projects and Stack Overflow answers that were at one point correct no longer work.

This is a quick and dirty example in a view controller, pretty much the same as the other examples. The sceneKitView property just ties to an SCNView. I'm trying to render in Metal, but OpenGL doesn't work either.

```objc
#import <SceneKit/SceneKit.h>
#import "ViewController.h"

// Implied by the accessors below; the struct wasn't shown in the post.
typedef struct {
    float x, y, z;
    float r, g, b;
} PointCloudStruct;

@interface ViewController ()
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    SCNScene *scene = [SCNScene new];

    SCNNode *cameraNode = [SCNNode new];
    cameraNode.camera = [SCNCamera new];
    cameraNode.camera.zNear = 0.0;
    cameraNode.camera.zFar = 40.0;
    [scene.rootNode addChildNode:cameraNode];
    cameraNode.position = SCNVector3Make(6.0, 1.5, 15.0);
    [cameraNode lookAt:SCNVector3Make(6.0, 1.5, 0.5)];

    [scene.rootNode addChildNode:[self pointCloudNode]];

    self.sceneKitView.showsStatistics = YES;
    self.sceneKitView.scene = scene;
    self.sceneKitView.pointOfView = cameraNode;
}

- (NSData *)pointCloudData {
    NSMutableData *data = [NSMutableData data];
    float values[] = {
        12.0, 2.0, 1.0, 1.0, 0.0, 0.5,
        11.0, 1.9, 0.9, 0.9, 0.1, 0.4,
        10.0, 1.8, 0.8, 0.8, 0.2, 0.3,
         9.0, 1.7, 0.7, 0.7, 0.3, 0.4,
         8.0, 1.6, 0.6, 0.6, 0.4, 0.5,
         7.0, 1.5, 0.5, 0.5, 0.5, 0.6,
         6.0, 1.4, 0.4, 0.4, 0.6, 0.6,
         5.0, 1.3, 0.3, 0.3, 0.7, 0.7,
         4.0, 1.2, 0.2, 0.2, 0.8, 0.8,
         3.0, 1.1, 0.1, 0.1, 0.9, 0.9,
         2.0, 1.0, 0.0, 0.0, 1.0, 1.0
    };
    for (int i = 0; i < 11 * 6; i = i + 6) {
        PointCloudStruct pcs;
        pcs.x = values[i];
        pcs.y = values[i + 1];
        pcs.z = values[i + 2];
        pcs.r = values[i + 3];
        pcs.g = values[i + 4];
        pcs.b = values[i + 5];
        [data appendBytes:&pcs length:sizeof(pcs)];
    }
    return data;
}

- (SCNNode *)pointCloudNode {
    NSData *data = [self pointCloudData];
    SCNGeometrySource *vertices =
        [SCNGeometrySource geometrySourceWithData:data
                                         semantic:SCNGeometrySourceSemanticVertex
                                      vectorCount:11
                                  floatComponents:YES
                              componentsPerVector:3
                                bytesPerComponent:sizeof(float)
                                       dataOffset:offsetof(PointCloudStruct, x)
                                       dataStride:sizeof(PointCloudStruct)];
    SCNGeometrySource *colours =
        [SCNGeometrySource geometrySourceWithData:data
                                         semantic:SCNGeometrySourceSemanticColor
                                      vectorCount:11
                                  floatComponents:YES
                              componentsPerVector:3
                                bytesPerComponent:sizeof(float)
                                       dataOffset:offsetof(PointCloudStruct, r)
                                       dataStride:sizeof(PointCloudStruct)];
    SCNGeometryElement *elements =
        [SCNGeometryElement geometryElementWithData:nil
                                      primitiveType:SCNGeometryPrimitiveTypePoint
                                     primitiveCount:11
                                      bytesPerIndex:sizeof(int)];
    elements.pointSize = 1.0;
    elements.minimumPointScreenSpaceRadius = 1.0;
    elements.maximumPointScreenSpaceRadius = 5.0;
    SCNGeometry *geometry = [SCNGeometry geometryWithSources:@[vertices, colours]
                                                    elements:@[elements]];
    return [SCNNode nodeWithGeometry:geometry];
}

@end
```

I've tried adding shaders using SCNProgram on the SCNGeometry, but that doesn't seem to do anything. Setting the geometry element data didn't help either.

One unusual thing: when showing the stats on the SCNView, it shows over 2.7k vertices, and that can't be right...

Most of the code is ported straight to Objective-C from here: https://github.com/eugeneu/PoindCloudRenderer

There are other examples around too. Is there something obvious (or not so obvious) wrong that I've missed?
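Not from the original post — for comparison, a minimal Swift sketch of the same interleaved-buffer technique, but with an explicit index buffer instead of passing nil element data, which is one variable worth isolating. The struct and function names are made up.

```swift
import SceneKit

// One interleaved record per point: position (x, y, z) then colour (r, g, b).
struct PointVertex {
    var x, y, z: Float
    var r, g, b: Float
}

func pointCloudNode(points: [PointVertex]) -> SCNNode {
    let stride = MemoryLayout<PointVertex>.stride
    let data = points.withUnsafeBytes { Data(bytes: $0.baseAddress!, count: $0.count) }

    // Both sources share the buffer; only the offset differs.
    let vertices = SCNGeometrySource(data: data,
                                     semantic: .vertex,
                                     vectorCount: points.count,
                                     usesFloatComponents: true,
                                     componentsPerVector: 3,
                                     bytesPerComponent: MemoryLayout<Float>.size,
                                     dataOffset: 0,
                                     dataStride: stride)
    let colours = SCNGeometrySource(data: data,
                                    semantic: .color,
                                    vectorCount: points.count,
                                    usesFloatComponents: true,
                                    componentsPerVector: 3,
                                    bytesPerComponent: MemoryLayout<Float>.size,
                                    dataOffset: 3 * MemoryLayout<Float>.size,
                                    dataStride: stride)

    // Explicit indices 0..<count rather than a nil-data element.
    let indices: [Int32] = (0..<Int32(points.count)).map { $0 }
    let indexData = indices.withUnsafeBytes { Data(bytes: $0.baseAddress!, count: $0.count) }
    let element = SCNGeometryElement(data: indexData,
                                     primitiveType: .point,
                                     primitiveCount: points.count,
                                     bytesPerIndex: MemoryLayout<Int32>.size)
    element.pointSize = 1.0
    element.minimumPointScreenSpaceRadius = 1.0
    element.maximumPointScreenSpaceRadius = 5.0

    return SCNNode(geometry: SCNGeometry(sources: [vertices, colours],
                                         elements: [element]))
}
```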
Post marked as solved
5 Replies
1.3k Views
This worked before the latest Xcode 9 beta 2. I have to presume it's something to do with the OpenGL performance problems there have been with the simulator. Now it seems to be called sometimes, but then something happens and it stops working. I haven't yet tracked down the something.
Post marked as solved
6 Replies
3.2k Views
I store colours in my Core Data model using transformable attributes. Starting with iOS 11, this no longer works and I get an error like:

```
... returned error Error Domain=NSCocoaErrorDomain Code=259 "The file couldn’t be opened because it isn’t in the correct format." UserInfo={NSUnderlyingException=Can't read binary data from file, NSUnderlyingError=0x60000005cb00 {Error Domain=NSCocoaErrorDomain Code=259 "The file “<Persistent store name>” couldn’t be opened because it isn’t in the correct format." UserInfo={NSFilePath=<Path to persistent store>, NSUnderlyingException=value for key 'NS.objects' was of unexpected class 'UIColor'. Allowed classes are '{( NSDictionaryMapNode, NSSet, NSDictionary, NSOrderedSet, NSDecimalNumber, NSUUID, NSNumber, NSNull, NSData, NSDate, NSURL, NSString, NSArray )}'.}}} with userInfo dictionary {
    NSUnderlyingError = "Error Domain=NSCocoaErrorDomain Code=259 \"The file \U201cEmpty.rag\U201d couldn\U2019t be opened because it isn\U2019t in the correct format.\" UserInfo={NSFilePath=<Path to persistent store>, NSUnderlyingException=value for key 'NS.objects' was of unexpected class 'UIColor'. ...}";
    NSUnderlyingException = "Can't read binary data from file";
}
```

The code works fine in iOS 10. I basically just set the attribute to a transformable type in the object model and specify UIColor (or NSColor on macOS) as the class property type.

Is there a new step, or will this functionality no longer be supported?
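Not from the original post — a hedged sketch of one workaround: giving the transformable attribute an explicit NSValueTransformer that archives the colour to data yourself, so the store only ever sees NSData. The transformer name is an assumption; it would have to match the "Transformer Name" set on the attribute in the model editor, and registration must happen before the store loads.

```swift
import UIKit

// Hypothetical transformer: UIColor <-> Data via secure keyed archiving.
@objc(ColorTransformer)
final class ColorTransformer: ValueTransformer {
    override class func transformedValueClass() -> AnyClass { NSData.self }
    override class func allowsReverseTransformation() -> Bool { true }

    override func transformedValue(_ value: Any?) -> Any? {
        guard let color = value as? UIColor else { return nil }
        return try? NSKeyedArchiver.archivedData(withRootObject: color,
                                                 requiringSecureCoding: true)
    }

    override func reverseTransformedValue(_ value: Any?) -> Any? {
        guard let data = value as? Data else { return nil }
        return try? NSKeyedUnarchiver.unarchivedObject(ofClass: UIColor.self,
                                                       from: data)
    }
}

// Register once, early (e.g. in application(_:didFinishLaunchingWithOptions:)):
// ValueTransformer.setValueTransformer(ColorTransformer(),
//     forName: NSValueTransformerName("ColorTransformer"))
```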
Post not yet marked as solved
1 Reply
489 Views
There used to be documentation on writing custom accessors and setters in Core Data; the NSManagedObject docs mention them and link to the Core Data Programming Guide. However, looking through it, I can't see any reference to actually writing them like you used to, e.g.:

```objc
- (void)setName:(NSString *)name {
    [self willChangeValueForKey:@"name"];
    [self setPrimitiveName:name];
    [self didChangeValueForKey:@"name"];
}
```

And you'd have to declare the setPrimitiveXXXX property in the interface, leading to method definition warnings too.

Is that functionality being phased out, or is it an omission in the docs? I'm actually going through some old code and trying to remove them where I can in favour of cleaner code, but sometimes it's just easier.
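Not from the original post — for reference, the same pattern can also be written through NSManagedObject's untyped primitive accessors, which sidesteps declaring a setPrimitiveXXXX property at all. A sketch with made-up entity and attribute names:

```swift
import CoreData

// Hypothetical entity with a custom setter that normalises its input.
@objc(Item)
final class Item: NSManagedObject {
    @NSManaged var name: String?

    // Same willChange/setPrimitive/didChange dance as the Objective-C
    // version, but via the generic KVC primitive accessor.
    func setNameTrimmed(_ newName: String?) {
        willChangeValue(forKey: "name")
        setPrimitiveValue(newName?.trimmingCharacters(in: .whitespaces),
                          forKey: "name")
        didChangeValue(forKey: "name")
    }
}
```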
Post not yet marked as solved
1 Reply
438 Views
So the iOS 11 release date draws ever nearer. I've found a bug in Core Data which means my app will not work in iOS 11, with no workaround: Core Data simply refuses to open a previously supported binary data store. It's not an error in my code; I'm using Core Data in a way that's supported. The bug only happens with one type of data store, and works fine with others. Core Data reports an attribute as unsupported when opening the database, but has no problem storing that supposedly unsupported value in the first place.

So when users upgrade to iOS 11, they can't open or create any files. I can't migrate the data in iOS 11 because I can't open the file to fix it.

I've submitted a Radar bug report (33895450), but Apple hasn't commented on it; for all I know, no one has even looked at it. It has a super simple Xcode project attached with full details on how to replicate the problem, which happens every single time.

So come iOS 11 release day, my app is hosed. I can't work on any other iOS 11 issues because my app won't open in iOS 11. I can't find any way to escalate the issue; Apple seems to have dropped the prioritisation, but this is data loss and an application crash. I looked at requesting technical support, but they specifically state they won't work with pre-release software.

I'll put some links in replies, because otherwise Apple will moderate this question and I'll lose even more days on help.
Post marked as solved
19 Replies
22k Views
On my iPad with the beta installed, my storage keeps growing as "Other". Looking in Storage, it's being taken up as "System", currently at 17GB. If I offload apps or delete data, System just grows to fill the space. The only way to fix it is to back up and restore, but then the problem returns after I use it for a week or two. It causes all sorts of problems; I can't even start Feedback to submit a problem to Apple with all the diagnostic data.
Post not yet marked as solved
0 Replies
1.1k Views
I want to get the corner coordinates of a plane that ARKit has identified into the ARSCNView's screen coordinates.

I've started by adapting Apple's example for creating the planes, so I have a Plane class:

```swift
class Plane: SCNNode {
    var anchor: ARPlaneAnchor

    init(anchor: ARPlaneAnchor) {
        self.anchor = anchor
        super.init()
        self.geometry = SCNPlane(width: CGFloat(anchor.extent.x),
                                 height: CGFloat(anchor.extent.z))
        self.rotation = SCNVector4Make(1.0, 0.0, 0.0, -Float.pi / 2.0)
        let material = SCNMaterial()
        material.diffuse.contents = UIColor.yellow
        material.specular.contents = UIColor.white
        self.geometry?.firstMaterial = material
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("Error")
    }

    func update(anchor: ARPlaneAnchor) {
        self.anchor = anchor
        if let plane = self.geometry as? SCNPlane {
            plane.width = CGFloat(anchor.extent.x)
            plane.height = CGFloat(anchor.extent.z)
        }
    }
}
```

And the ARSCNViewDelegate methods in the view controller to identify and update it:

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor {
        let plane = Plane(anchor: planeAnchor)
        node.addChildNode(plane)
        planes[planeAnchor] = plane
        print("plane = \(plane)")
    }
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor {
        planes[planeAnchor]?.update(anchor: planeAnchor)
    }
}

func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor {
        planes.removeValue(forKey: planeAnchor)
    }
}
```

As I understand it, the plane has its own local coordinate space in SceneKit, so I want to just create four SCNVector3 values, convert them from the Plane's local space to the scene's world coordinate space, and from there project them into the screen's space. This is how I'm doing it:

```swift
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    for (_, aPlane) in planes {
        // Vectors representing the corners of the plane
        let maxX = aPlane.anchor.extent.x / 2.0
        let minX = -maxX
        let minY = aPlane.anchor.extent.z / 2.0
        let maxY = -minY

        var point1 = SCNVector3Make(minX, 0.0, minY)
        var point2 = SCNVector3Make(maxX, 0.0, minY)
        var point3 = SCNVector3Make(maxX, 0.0, maxY)
        var point4 = SCNVector3Make(minX, 0.0, maxY)
//        var point1 = SCNVector3Make(minX, minY, 0.0)
//        var point2 = SCNVector3Make(maxX, minY, 0.0)
//        var point3 = SCNVector3Make(maxX, maxY, 0.0)
//        var point4 = SCNVector3Make(minX, maxY, 0.0)

        print("initial nodes ------------------")
        print("point1 = \(point1)")
        print("point2 = \(point2)")
        print("point3 = \(point3)")
        print("point4 = \(point4)")

        // Get the root node and convert the coords from the plane's coord system
        let rootNode = sceneView.scene.rootNode
        point1 = rootNode.convertVector(point1, from: aPlane)
        point2 = rootNode.convertVector(point2, from: aPlane)
        point3 = rootNode.convertVector(point3, from: aPlane)
        point4 = rootNode.convertVector(point4, from: aPlane)

        print("first conversion nodes ------------------")
        print("point1 = \(point1)")
        print("point2 = \(point2)")
        print("point3 = \(point3)")
        print("point4 = \(point4)")

        // Finally, project them to the screen view
        point1 = sceneView.projectPoint(point1)
        point2 = sceneView.projectPoint(point2)
        point3 = sceneView.projectPoint(point3)
        point4 = sceneView.projectPoint(point4)

        print("final conversion nodes ------------------")
        print("point1 = \(point1)")
        print("point2 = \(point2)")
        print("point3 = \(point3)")
        print("point4 = \(point4)")
    }
}
```

However, whatever I do, when I have a plane in full view, I still get coordinates outside the bounds of the view. I'm pretty sure it's something simple, but I've tried many combinations now.
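Not from the original post — one thing this kind of conversion is sensitive to: SceneKit's convertVector(_:from:) transforms a direction (rotation and scale only, no translation), while convertPosition(_:from:) transforms a point (translation included), so corner points converted as vectors all project as if the plane sat at the origin. A minimal sketch of the position-based variant, with an illustrative function name:

```swift
import ARKit
import SceneKit

// Sketch: project one plane-local corner to screen coordinates using
// convertPosition, which applies the node's translation. Passing nil
// as the destination node means "convert to world space". The z of
// the projected vector is depth (0...1), not a screen coordinate.
func screenPoint(forLocalCorner corner: SCNVector3,
                 of planeNode: SCNNode,
                 in sceneView: ARSCNView) -> CGPoint {
    let world = planeNode.convertPosition(corner, to: nil)
    let projected = sceneView.projectPoint(world)
    return CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
}
```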