Hi

I'm using Swift 4 and the AVCam example code from Apple. In my own app I have an issue where the captured image orientation is wrong. My requirement is to save the photos to my application's folder rather than the Photos library. I've spent a day trying to correct my own code and got nowhere.

As a test I modified the AVCam code to write the photo data to a static variable in a struct and then display that image on the camera view in AVCam by tapping a button, as follows:

    // In the camera view controller:
    @IBOutlet weak var pv: UIImageView! // for previewing test

    @IBAction func btn(_ sender: UIButton) { // for setting preview test image
        let dataProvider = CGDataProvider(data: photoTest.photo! as CFData)
        let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!,
                                           decode: nil,
                                           shouldInterpolate: true,
                                           intent: .defaultIntent)
        let photo = UIImage(cgImage: cgImageRef)
        pv.contentMode = .scaleAspectFill
        pv.image = photo
    }

The orientation issue is replicated in AVCam with the above code. In my own app the image is previewed before saving to the app's folder, and the image's orientation is always wrong in the preview as well as when reloading the image from disk. That is, if the iPhone is in portrait orientation, the preview image is rotated 90 degrees anti-clockwise; in landscape the same happens, so the preview image appears to be the correct way up.

I'm assuming I've completely missed the point somewhere along the line, as it seems to me that regardless of device orientation the top of a photo should always be the top. For example, in the Photos app photos are shown the correct way up regardless of orientation.

Any help would be much appreciated, this is driving me nuts ;o)

Cheers
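A sketch of the likely fix, under one assumption: `photoTest.photo` holds the JPEG `Data` produced by the capture delegate (the name is taken from the post above). `UIImage(data:)` reads the EXIF orientation tag stored in the JPEG container, whereas `CGImage(jpegDataProviderSource:...)` decodes only the raw pixels and discards that tag, which is why the `UIImage(cgImage:)` route comes out rotated.

```swift
import UIKit

// Inside the same @IBAction as the post's test code:
if let data = photoTest.photo, let photo = UIImage(data: data) {
    // photo.imageOrientation now reflects how the device was held,
    // so UIImageView draws it the right way up.
    pv.contentMode = .scaleAspectFill
    pv.image = photo
}
```

The same applies when reloading from disk: load the file as `Data` (or use `UIImage(contentsOfFile:)`) rather than going through `CGImage` directly.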
Hi

I've just updated my Xcode and now the simulator isn't simulating location services. It worked a few times, but not every time, and now it doesn't work at all. The app has permission etc., but no matter what I do it keeps crashing when running code against the location manager. The first few times, if I stopped the app, the simulator would set the location back to None and I'd have to set it back to Apple or whatever. I tried rebooting etc., and now it's not working at all. I saw a post where people said they could get it working by switching between the various location options in the simulator a few times; I tried that more than a dozen times and nothing.

Is this a known bug perhaps? Does anyone know a fix? It's a major pain having to test on a physical device...

Cheers
Mark
Hi

I've sort of asked this before but got no response. I've been on this now for two days and am getting nowhere.

Basically, as a quick test, I added some code to capture the photoData just before the "didFinishCaptureFor" func writes the image to the Photos library, then a button on the camera VC that loads the data into an image view (pv) on the same VC, as follows:

    let dataProvider = CGDataProvider(data: photoTest.photo! as CFData)
    let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!,
                                       decode: nil,
                                       shouldInterpolate: true,
                                       intent: .defaultIntent)
    let photo = UIImage(cgImage: cgImageRef)
    pv.contentMode = .scaleAspectFill
    pv.image = photo

The image that is saved to the Photos library is correctly orientated, while the one previewed is not.

I understand how the image is captured etc., and have read so many posts that explain why, but I can't find an example that works where the image is either previewed as above, or saved to the app folder as opposed to the Photos library.

This post https://forums.developer.apple.com/thread/102846 has a response by bford that states: "When you write the buffer to any of the standard still image containers (JPEG, HEIC, DNG, TIFF, etc), the image is rotated and/or flipped appropriately at display time."

If this is the case then I'm clearly missing some code, as I've not modified the AVCam code, simply added a preview, yet the image is still not displayed properly.

Note that the only reason I need custom camera code is that I need to record the location and direction the camera was pointed when the photo was taken. The image picker doesn't return location data, and ultimately I don't want to have to tap "Use Photo", rather just save every one.

Thanks in advance
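The quoted response in that thread is the key: the rotation lives in the container's metadata, not in the pixels. So for saving to the app's folder, a minimal sketch is to write the JPEG `Data` from `photo.fileDataRepresentation()` to disk unmodified; the file name scheme and the function itself are illustrative, not from AVCam.

```swift
import Foundation

// `photoData` is the Data returned by photo.fileDataRepresentation()
// in didFinishProcessingPhoto.
func savePhoto(_ photoData: Data) throws -> URL {
    let docs = FileManager.default.urls(for: .documentDirectory,
                                        in: .userDomainMask)[0]
    let url = docs.appendingPathComponent("photo-\(Date().timeIntervalSince1970).jpg")
    // The JPEG container keeps its EXIF orientation tag, so reloading it
    // later with UIImage(data:) or UIImage(contentsOfFile:) displays it
    // the right way up.
    try photoData.write(to: url)
    return url
}
```

The preview goes wrong only because `UIImage(cgImage:)` bypasses that metadata; previewing with `UIImage(data: photoData)` instead should match what Photos shows.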
Hi

First: apologies, I also posted this in the AVFoundation (video and camera) forum, but then I noticed there is very little activity there and even fewer responses...

I'm using Swift 4 and the AVCam example code from Apple. In my own app I have an issue where the captured image orientation is wrong. My requirement is to save the photos to my application's folder rather than the Photos library. I've spent a day trying to correct my own code and got nowhere.

As a test I modified the AVCam code to write the photo data to a static variable in a struct and then display that image on the camera view in AVCam by tapping a button, as follows:

    // In the camera view controller:
    @IBOutlet weak var pv: UIImageView! // for previewing test

    @IBAction func btn(_ sender: UIButton) { // for setting preview test image
        let dataProvider = CGDataProvider(data: photoTest.photo! as CFData)
        let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!,
                                           decode: nil,
                                           shouldInterpolate: true,
                                           intent: .defaultIntent)
        let photo = UIImage(cgImage: cgImageRef)
        pv.contentMode = .scaleAspectFill
        pv.image = photo
    }

The orientation issue is replicated in AVCam with the above code. In my own app the image is previewed before saving to the app's folder, and the image's orientation is always wrong in the preview as well as when reloading the image from disk. That is, if the iPhone is in portrait orientation, the preview image is rotated 90 degrees anti-clockwise; in landscape the same happens, so the preview image appears to be the correct way up.

I'm assuming I've completely missed the point somewhere along the line, as it seems to me that regardless of device orientation the top of a photo should always be the top. For example, in the Photos app photos are shown the correct way up regardless of orientation.

Any help would be much appreciated, this is driving me nuts ;o)

Cheers
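For completeness, this is the part of AVCam's own capture path worth checking: before capturing, it copies the preview's current video orientation onto the photo output connection, so the recorded orientation tag matches how the device is held. A hedged sketch, where `photoOutput`, `previewLayer` and `settings` stand for your session's photo output, preview layer and `AVCapturePhotoSettings`:

```swift
// Keep the photo output connection's orientation in step with the
// preview before each capture; otherwise the EXIF tag in the resulting
// JPEG can disagree with how the device was actually held.
if let photoOutputConnection = photoOutput.connection(with: .video),
   let previewOrientation = previewLayer.connection?.videoOrientation {
    photoOutputConnection.videoOrientation = previewOrientation
}
photoOutput.capturePhoto(with: settings, delegate: self)
```

If that is in place, the remaining step is only to display the captured JPEG via a path that honours EXIF (`UIImage(data:)`), not via `CGImage(jpegDataProviderSource:...)`.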
Afternoon...

I've not been on here for a while, but today I've noticed this forum is pretty much unusable in iOS, on iPhone or iPad. The header block is transparent and covers the top of the screen, so you get words overlapping words. I was getting some sort of "forbidden" error trying to post on my iPhone. If I scroll down on my iPad, the screen starts jumping about as if I were dragging my finger all over the place.

Seems fine on my MacBook Pro though.

Cheers
Afternoon All

I have a collection view with a simple custom layout (each cell is a simple image) and am getting errors when trying to delete items, i.e. "assertion failure in -[UICollectionViewData validateLayoutInRect]" and then also "UICollectionView received layout attributes for an index path that does not exist". The second error crashes the process.

I've created a very simple test, without custom layout, as follows:

    // This works
    items.remove(at: 0)
    cv.deleteItems(at: [IndexPath(row: 0, section: 0)])
    cv.reloadData()

This works fine. With the custom layout I've tried at least a dozen solutions I found online but nothing works: batch updates, setting delegates to nil and resetting them after the updates, invalidating the layout, etc. The error is almost always on the line cv.deleteItems(at: [IndexPath(row: 0, section: 0)]), sometimes in the app delegate.

Does anyone have a simple checklist of what I'd need to do to get this working with custom layout?

Many thanks...
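A minimal sketch of the usual fix, assuming `items` is the data source array and `cv` the collection view: mutate the model and call `deleteItems` together inside `performBatchUpdates`, and drop the trailing `reloadData()`. Mixing `reloadData()` with insert/delete calls in the same pass is a common trigger for exactly these layout-attribute assertions.

```swift
// Delete one item: model first, then the matching view update,
// both inside the same batch. No reloadData afterwards -- the batch
// update already animates and revalidates the layout.
func deleteItem(at indexPath: IndexPath) {
    cv.performBatchUpdates({
        items.remove(at: indexPath.item)
        cv.deleteItems(at: [indexPath])
    }, completion: nil)
}
```

For the custom layout side, the checklist is short: the layout's `prepare()` must rebuild its cached attributes from the data source's *current* counts, and `layoutAttributesForElements(in:)` must never hand back attributes for index paths beyond those counts; stale cached attributes are the other common cause of the "index path that does not exist" crash.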
Hi

I'm working on a map-based app for doing geographical surveys, which will be used in locations where internet connectivity doesn't exist. This is my first app, so I'm learning as I go.

I understand that I can take snapshots of a map (while online, of course) and am wondering if it's feasible/allowed to save these images and use them later as overlays when there is no data connection. My questions then are:

a) Is this a valid approach?
b) When capturing the map image, would the region details be available so it can be used as a tile/overlay?

I've discussed using OpenStreetMap tiles with a non-developer friend who has several years' experience with them as a user, and have been told they can be problematic at best. Also, I'd very much like to avoid using third-party functionality.

Thanks very much
Mark
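On question (b): with `MKMapSnapshotter` you supply the region yourself in the snapshot options, so it is available by construction and can be persisted alongside the image. A hedged sketch (the function and its completion shape are illustrative):

```swift
import MapKit

// Capture a map image for a given region while online. The caller keeps
// `region` next to the saved image so the pair can later back an overlay.
func snapshotRegion(_ region: MKCoordinateRegion,
                    size: CGSize,
                    completion: @escaping (UIImage?, MKCoordinateRegion) -> Void) {
    let options = MKMapSnapshotter.Options()
    options.region = region
    options.size = size
    let snapshotter = MKMapSnapshotter(options: options)
    snapshotter.start { snapshot, _ in
        // snapshot?.image is a plain UIImage; write it to disk and
        // store the region (e.g. as JSON) for offline use.
        completion(snapshot?.image, region)
    }
}
```

Whether caching rendered Apple map imagery for offline use is *allowed* is a licensing question rather than an API one, so it's worth checking the current Apple Maps terms before committing to this approach.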
Hi

I've been playing about with Instruments and the memory graph, basically testing progress on my first app, and I'm completely confused.

My app has several view controllers and uses images, maps and Core Data etc. I'm almost certain I don't have any retain cycles: I've checked that all my classes and view controllers are deinitialising, as well as removing the map delegate and the map from its superview. I'm also using capture lists in closures, setting variables to nil and purging arrays etc. when no longer needed.

However, if I profile the app in Instruments looking for leaks, it tells me I have maybe a dozen or so. If I run through the same processes in the app using the debug memory graph, no leaks or other runtime issues are reported. Memory usage according to Xcode peaks at around 120 MB and settles back down to about 70 MB. In Instruments, although leaks are reported, drilling down through the detail and looking at the related source code isn't telling me anything that helps. My questions then are:

a) Are there a lot of false positives, relating to leaks, in Instruments?
b) Does anyone have a link on finding memory leaks with Instruments that doesn't simply create a retain cycle that could be spotted from a hundred miles away? I.e. something real-world. I'm struggling to relate the massive amount of information in Instruments to actual issues.

Thanks very much :-]
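One cheap cross-check that complements both tools: log `deinit` on the classes you suspect. If `deinit` fires when a screen is dismissed, that object graph is being released regardless of what the Leaks instrument reports for it. A trivial sketch (the class name is illustrative):

```swift
import UIKit

final class SurveyViewController: UIViewController {
    deinit {
        // If this prints when the screen is popped/dismissed,
        // this controller and whatever it exclusively owns were freed.
        print("SurveyViewController deallocated")
    }
}
```

Remaining Leaks hits that pass this check are often allocations inside system frameworks (maps and image decoding are frequent offenders) rather than your own retain cycles.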
Hi

I've been trying to wrap my head round memory management etc. and noticed that simply presenting a UIAlertController (with a simple text field, save and cancel buttons) bumps up my memory usage by about 12 to 14 MB. The memory doesn't appear to get released, although it doesn't keep increasing either.

Is this normal? I know it's not much, just curious; it seems a lot for such a simple control.

Cheers
Hi

I'm building a geo survey app in which I want the user to be able to set the user tracking mode to "follow with heading", and to zoom in and out without the tracking mode changing. In use, the user would point the phone at a hill, for example, then zoom to a level that displays the hill on the map. The problem is that when the map is zoomed, the tracking mode changes back to the default and the map no longer rotates as the phone's heading changes. It's so frustrating it renders "follow with heading" almost pointless in this context.

Any ideas much appreciated.

Cheers
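A hedged workaround sketch: since MapKit drops the tracking mode when the user pans or pinch-zooms, re-apply `.followWithHeading` once the region change settles, from the map view delegate. `trackingEnabled` is an assumed flag your UI sets when the user switches tracking on.

```swift
import MapKit

// MKMapViewDelegate callback: fires after a zoom/pan gesture finishes.
func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
    // Re-assert heading tracking if the gesture knocked it back to .none.
    if trackingEnabled && mapView.userTrackingMode != .followWithHeading {
        mapView.setUserTrackingMode(.followWithHeading, animated: true)
    }
}
```

The trade-off is that the map will snap back to the user's location after each zoom, which may or may not be acceptable for your survey use case.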
Hi

I'm new to Core Data and am having numerous issues. As I'm new to Swift etc., I thought I'd work through a tutorial and see what is different in my project (a simple tracker using Core Data and Core Location).

Looking at the MoonRunner tutorial on raywenderlich.com I notice (and it may well be a red herring) that in the tutorial, if you click the "Run" entity in the data model, the Quick Help section on the right states:

    Declaration: public class Run: NSManagedObject
    Declared in: Run+CoreDataClass.swift

If I click an entity in my project, it simply states "No Quick Help".

Given I'm getting very random errors, e.g. entities not being recognised (when declaring a variable against them) while the code still compiles and runs, I'm wondering if I've missed something fundamental or have somehow got a corrupt Xcode install or something. Note that I created a new project and data model just in case, and it still doesn't have the Quick Help I see in the RW tutorial.

Thanks in advance for any pointers...
Hi

I've read in several places that the default Core Data code produced by Xcode when creating a new project shouldn't really be in the app delegate file. I'm new to Core Data and am finding it difficult to make sense of the many contradictory examples I'm finding on the internet. In some cases (admittedly they are just examples) the data manipulation and retrieval code is also in the app delegate. Is this correct? Should custom code in the app delegate be limited to calls to other classes? What are the pitfalls of putting functionality in the app delegate?

I ask because I shifted the default persistent container and context etc. into a separate class: when I tried to do a save in the background (using DispatchQueue), I got an error stating, roughly, that I couldn't do that against the viewContext as it was part of the app delegate (apologies for not being specific, that was over a week ago).

Foolishly I thought I'd do my learning by building what I thought, based on many years' experience with VB etc., was a very simple app. Turns out that nothing is that simple. I've hit so many walls I think my approach in many places is fundamentally wrong. I don't learn well reading manuals; I prefer working examples.

If anyone has an example of, or a link to, code that demonstrates how to work correctly with Core Data with relationships etc., it'd be much appreciated. In particular, how to:

a) not lock up the UI (I have potentially thousands of related records);
b) know when all data for a relationship has been retrieved (I'm getting an error now stating an array populated from NSMutableSet.allObjects mutated while being iterated).

Thanks in advance...

PS: My code is written in Swift.
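A minimal sketch of moving the stack out of the app delegate into its own class. The container name "Model" is a placeholder for your .xcdatamodeld file, and the class itself is illustrative. The important detail for point (a) is that background work must go through a context the container vends for that purpose, not through `viewContext` plus `DispatchQueue` (which is what produces the error described above).

```swift
import CoreData

final class CoreDataStack {
    static let shared = CoreDataStack()

    let container: NSPersistentContainer = {
        let container = NSPersistentContainer(name: "Model") // placeholder name
        container.loadPersistentStores { _, error in
            if let error = error { fatalError("Store load failed: \(error)") }
        }
        return container
    }()

    // Main-queue context: UI reads, fetched results controllers, etc.
    var viewContext: NSManagedObjectContext { return container.viewContext }

    // Heavy writes go through a private-queue context so the UI's
    // viewContext is never blocked.
    func saveInBackground(_ work: @escaping (NSManagedObjectContext) -> Void) {
        container.performBackgroundTask { context in
            work(context)
            if context.hasChanges { try? context.save() }
        }
    }
}
```

On point (b), the "mutated while being iterated" error usually means a to-many set was copied with `allObjects` on one queue while another context changed it; keeping each managed object inside its own context's queue (via `perform`/`performAndWait`) avoids it.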
Hi

I'm getting into Core Data (slowly) and am a little confused as to how I should be saving the data. I've looked at various examples and they all use persistentContainer.viewContext, and in several cases that is defined in the AppDelegate.swift file.

The developer documentation states that viewContext is "The managed object context associated with the main queue (read-only)". Yet elsewhere the recommendation is that data saves should be on a background thread/queue to avoid interrupting the UI.

I'm pretty sure I've missed something here... Am I perhaps just confusing threads and queues?

Cheers
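A sketch of the usual division of labour, assuming `container` is the `NSPersistentContainer` from the Xcode template: `viewContext` serves the main queue (anything the UI reads), while `performBackgroundTask` hands you a *separate* private-queue context for saves. Note that "(read-only)" in the docs describes the *property* (you can't replace the context), not the data, so saving small UI-driven edits on `viewContext` is fine; it's bulk work that belongs in the background. `MyEntity` below is an illustrative entity name.

```swift
// Have background saves flow back into the UI's context automatically.
container.viewContext.automaticallyMergesChangesFromParent = true

// Bulk insert/save off the main queue.
container.performBackgroundTask { context in
    let item = MyEntity(context: context)   // illustrative entity
    item.name = "example"
    do {
        try context.save()   // runs on the context's private queue
    } catch {
        print("Background save failed: \(error)")
    }
}
```

The one hard rule: each context is tied to its queue, so never touch a background context's objects from the main thread or vice versa.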
Hi

I am doing a feasibility study for a risk register database. I'm fairly new to iOS development and have not used CloudKit before.

The register uses Core Data for local persistence, and the plan is to create a centralised online data store in a way that:

1) A client organisation can store and share data with its staff securely (staff having full read/write access).
2) There is no need for the client to subscribe to some sort of third-party online storage option or set up their own server etc. I.e. it would use iCloud.

I've just watched a couple of WWDC videos on CloudKit and was wondering if CloudKit sharing could provide an adequate solution. Please shoot me down if there are more suitable alternatives or I've misunderstood what can be done with CloudKit, but what I have in mind is something along the following lines:

1) The organisation has an iCloud account against which the data is stored in iCloud using Core Data (locally) and CloudKit.
2) A specific header record (Core Data, or a CloudKit-compatible representation of that record) could be shared to any of its staff's iCloud accounts. Not all records would be shared with all staff.
3) Any number of child records could be created against the shared header by the staff remotely on their iPhones/iPads.
4) Any data created/edited in step 3 would be synchronised back to the organisation's account and would be available to anyone else involved in the share.

I appreciate this is a bit of a simplification and I'm not necessarily looking for examples; rather, is this a practical solution, and if not, a pointer or two to get me on the right track? Ideally I'm aiming for the simplest solution possible, e.g. the app would be available on the App Store and the only other setup requirement for the organisation would be to create shares with its staff and make sure they had copies of the app on their iPhones etc.

Thanks very much...
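Steps 2-4 map fairly directly onto `CKShare`: the "header" becomes a root record, the share is presented to the organisation via `UICloudSharingController`, and records saved into the shared zone by participants sync back to the owner. A hedged sketch of step 2 only; the record title, function and permission choices are illustrative, and key names here follow the current SDK (older SDKs spelled the title key `CKShareTitleKey`):

```swift
import CloudKit
import UIKit

// Wrap a "header" record in a CKShare and let the organisation
// invite specific staff members.
func shareHeaderRecord(_ headerRecord: CKRecord,
                       from viewController: UIViewController) {
    let share = CKShare(rootRecord: headerRecord)
    share[CKShare.SystemFieldKey.title] = "Risk register" as CKRecordValue

    let controller = UICloudSharingController { _, completion in
        // Root record and share must be saved in the same operation.
        let op = CKModifyRecordsOperation(recordsToSave: [headerRecord, share],
                                          recordIDsToDelete: nil)
        op.modifyRecordsCompletionBlock = { _, _, error in
            completion(share, CKContainer.default(), error)
        }
        CKContainer.default().privateCloudDatabase.add(op)
    }
    controller.availablePermissions = [.allowReadWrite, .allowPrivate]
    viewController.present(controller, animated: true)
}
```

One caveat for the feasibility study: keeping the local Core Data store and the CloudKit records in sync is your code's responsibility in this design (there is no automatic Core Data-to-CloudKit bridge in the Swift 4 era SDKs), and that sync layer tends to be the bulk of the work.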
Hi

I've been trying to get my head round Xcode and Swift for the past month or so and seem to be spending more time trying to resolve issues than learning. I've 20+ years' experience coding for Windows (shame on me, I know ;o) and, being so impressed with Apple hardware since my first iPhone (1) and MacBook etc., decided to do myself a favour and learn something new. Please tell me I haven't made a horrible mistake.

Every day something seems to go wrong that is outside my control. The forums are filled with stories of functionality breaking due to updates, zero response from Apple support and the like, appeals processes that take weeks or months, and apps being rejected from the App Store that had previously been fine.

Not wanting to give up, can anyone point me at some documentation that simply lists all the things to watch out for? So far I've had issues with:

- the data modeller not displaying entities or their properties correctly
- false positive errors when compiling, about undeclared types, for entities that do exist and are referenced elsewhere without issue (and the code ran despite the errors)
- a plist vaporising and not letting me quit Xcode as it couldn't be autosaved
- the location services access request disappearing before it can be touched
- Xcode crashing out without error
- the assistant editor opening the wrong file from the storyboard
- code in files disappearing and then reappearing when I restart Xcode

Every other day I seem to get an error relating to something that was working when I shut down the day before and suddenly is not working when I restart. Could I be cursed perhaps, or is this normal? Please note that I tend to jump in at the deep end when it comes to learning, so this could all be down to not fully "reading the manual".