Post not yet marked as solved
Years ago, one could use a window in Xcode to select an app being developed on an iPhone, and see the *.sqlite files that the app used for Core Data. One could copy those files back onto the Mac, and so preserve the current state of the app's data.
I've tried to get back to that window in the current Xcode, but with no joy. Can anyone give me a clue as to how to preserve a Core Data state for an iOS app?
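In current Xcode that browser lives under Window ▸ Devices and Simulators: select the device, select the app, and use the gear menu's "Download Container…". Once the container is on the Mac, the SQLite store has to be copied together with its -wal/-shm sidecar files, or the backup can miss recent transactions. A minimal sketch of the copy step, with a hypothetical store URL standing in for the one a real app would get from its NSPersistentContainer:

```swift
import Foundation

// Copy a Core Data SQLite store together with its WAL/SHM sidecar files.
// `storeURL` is hypothetical; in an app it would come from the
// persistentStoreDescriptions of an NSPersistentContainer.
func backUpStore(at storeURL: URL, to backupDir: URL) throws {
    let fm = FileManager.default
    try fm.createDirectory(at: backupDir, withIntermediateDirectories: true)
    for suffix in ["", "-wal", "-shm"] {
        let src = URL(fileURLWithPath: storeURL.path + suffix)
        guard fm.fileExists(atPath: src.path) else { continue }  // sidecars may be absent
        let dst = backupDir.appendingPathComponent(src.lastPathComponent)
        if fm.fileExists(atPath: dst.path) { try fm.removeItem(at: dst) }
        try fm.copyItem(at: src, to: dst)
    }
}
```

Copying the three files as a unit is the point; a lone .sqlite file from a WAL-mode store is not a complete snapshot.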
I recently got an Apple Watch, and now I have lots of health data on my iPhone. I'd like to get that data onto my Mac, where I can analyze it using statistical packages like "R".
As a kludge, I can run an iPhone app in the debugger to "print" the health data as CSV records to the Xcode output, and then do a "Select All", "Copy", and "Paste" to put the data into Aquamacs, where I can save it to a Mac *.csv file and access it using "R".
This is straightforward, but tedious. Does anyone have another suggestion? I've considered using CloudKit or CoreData in iCloud, but both seem like overkill. Is there any way to write a CSV file to iCloud, or to connect to a file URL on my Mac over the local network from the iPhone?
Thanks.
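One low-ceremony alternative: write the CSV into the app's Documents directory and declare the app file-sharing-capable (UIFileSharingEnabled plus LSSupportsOpeningDocumentsInPlace in Info.plist), so the file shows up in the Files app and can be AirDropped or shared to the Mac. A sketch of the writing half, here pointed at an arbitrary URL:

```swift
import Foundation

// Write rows as a CSV file. In an iOS app the destination would be
// FileManager's .documentDirectory URL; with UIFileSharingEnabled and
// LSSupportsOpeningDocumentsInPlace set in Info.plist, the result is
// visible in the Files app for transfer to a Mac.
func writeCSV(header: [String], rows: [[String]], to url: URL) throws {
    var lines = [header.joined(separator: ",")]
    lines += rows.map { $0.joined(separator: ",") }
    try lines.joined(separator: "\n").write(to: url, atomically: true, encoding: .utf8)
}
```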
I am developing an iOS app (mostly for myself) where I'm entering a lot of data that would be very tedious to re-enter. Right now, my iPad works fine with Core Data, and the corresponding CloudKit database seems accurate. But I am old enough to remember when Apple decided that my nearly-empty iOS Calendar app was "the truth" and wiped out 25,000 entries on OS X.
So, I would like a strategy for backing up the current state of either Core Data or CloudKit, in prep for trying out my app on the iPhone. If there is a way to write a file on iOS that would survive a recompile, I don't know how to do it. The only idea I have is to print() CSV-formatted records in the Xcode debugger and cut-and-paste them from Xcode to Aquamacs. I'd appreciate any comments...
PS -- I haven't written JavaScript in the last decade.
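If the print()-based export stays, quoting fields keeps records with embedded commas or quotes from corrupting the CSV. A small helper following RFC 4180 quoting rules:

```swift
import Foundation

// Quote a CSV field per RFC 4180: wrap it in quotes when it contains a
// comma, a quote, or a newline, doubling any embedded quotes.
func csvField(_ s: String) -> String {
    if s.contains(",") || s.contains("\"") || s.contains("\n") {
        return "\"" + s.replacingOccurrences(of: "\"", with: "\"\"") + "\""
    }
    return s
}
```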
I have a UIScrollView taking up about the lower third of the screen on an iPhone 6s. The content view is a UIImageView displaying a typical iPhone camera image (2100x1575). When the app first puts an image in the UIImageView/UIScrollView, it is well zoomed in, because the scroll view is only about 400x400. At this point, single-finger scroll gestures have no effect. If I do a pinch gesture, the zoom effect "wakes up" and I can manipulate the image, but in a way that confuses me. How far I can scroll in the image seems to be affected by the zoom level; surprisingly (to me), I can roam the zoomed-in image over a wider range than the zoomed-out image.
I tried to make sense of what's going on by putting print statements in the scrollViewDidZoom and scrollViewDidScroll delegate methods, but when I do a pinch gesture, I get both zoom AND scroll callbacks from a "pure" pinch gesture.
Where can I find an explanation that deconvolves the behavior of the scroll view?
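Part of the behavior falls out of arithmetic: the scrollable range is the contentSize minus the visible bounds, and contentSize is the image size times zoomScale, so zooming in enlarges the roamable range (and a pinch moves contentOffset too, which is why scroll callbacks fire). A framework-free sketch of that relationship:

```swift
// The scrollable range of a UIScrollView is contentSize minus the visible
// bounds, and contentSize grows with zoomScale -- which is why a zoomed-in
// image can be panned over a wider range than a zoomed-out one.
// Pure arithmetic; sizes are in points.
func maxContentOffset(imageSize: (w: Double, h: Double),
                      zoomScale: Double,
                      bounds: (w: Double, h: Double)) -> (x: Double, y: Double) {
    let contentW = imageSize.w * zoomScale
    let contentH = imageSize.h * zoomScale
    return (max(0, contentW - bounds.w), max(0, contentH - bounds.h))
}
```

At zoomScale 1.0 a 2100x1575 image in a 400x400 view can be panned 1700 points horizontally; shrink the zoom until the content fits the bounds and the pannable range collapses to zero.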
I am trying to read a straightforward XML file, but I can't get XMLDocument() to compile -- "Use of unresolved identifier 'XMLDocument'; did you mean 'UIDocument'?" -- although I can get an XMLParser to initialize with the same URL. What am I missing here?
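For what it's worth, XMLDocument is part of Foundation on macOS only, which is why the identifier is unresolved in an iOS target; the event-driven XMLParser is the Foundation tool that does exist on iOS. A minimal delegate sketch that just collects element names:

```swift
import Foundation
#if canImport(FoundationXML)
import FoundationXML  // XMLParser lives here on non-Darwin platforms
#endif

// XMLDocument (tree-based) is macOS-only; XMLParser (event-driven) is the
// cross-platform Foundation parser, driven through a delegate.
final class ElementCollector: NSObject, XMLParserDelegate {
    var elements: [String] = []
    func parser(_ parser: XMLParser, didStartElement elementName: String,
                namespaceURI: String?, qualifiedName qName: String?,
                attributes attributeDict: [String: String] = [:]) {
        elements.append(elementName)
    }
}

func elementNames(in xml: String) -> [String] {
    let parser = XMLParser(data: Data(xml.utf8))
    let collector = ElementCollector()
    parser.delegate = collector
    _ = parser.parse()
    return collector.elements
}
```

The same XMLParser(contentsOf:) initializer accepts a file URL directly.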
I wanted to enumerate into a .photoslibrary to see which image files I have im/exported and used in other places. When I create a directory enumerator, the directory attributes show that I've gotten ahold of the .photoslibrary directory, but enumerator.nextObject returns nil. I see that there are options like FileManager.DirectoryEnumerationOptions.skipsPackageDescendants, but setting or unsetting the bit has no effect that I can see. The same code works OK with ordinary directories, like ~/Pictures. Unix commands like "find" and "ls" have no problem iterating into a photoslibrary, so am I condemned to run "find" and read its stdout, rather than using FileManager goodness?
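Falling back to find is not unreasonable; Process can run it and capture stdout without leaving Swift. A sketch (assuming find is at /usr/bin/find):

```swift
import Foundation

// Fallback enumeration: run /usr/bin/find and read its stdout, since `find`
// happily descends into a .photoslibrary package where a default FileManager
// enumerator may stop at the package boundary.
func findPaths(under root: String) throws -> [String] {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/bin/find")
    task.arguments = [root, "-type", "f"]
    let pipe = Pipe()
    task.standardOutput = pipe
    try task.run()
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    task.waitUntilExit()
    return String(decoding: data, as: UTF8.self)
        .split(separator: "\n").map(String.init)
}
```

Note that on current macOS the Photos library is also protected by privacy controls, so the process needs Full Disk Access (or the Photos entitlement) regardless of which enumeration route is used.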
I've got a trivial macOS script:

    #!/bin/sh
    # OpenInSafari.sh
    open /tmp/GitNAVs.html

that I try to invoke with:

    let bundle = Bundle.main
    if let script = bundle.url(forResource: "OpenInSafari", withExtension: "sh") {
        do {
            let scriptTask = try NSUserScriptTask(url: script)
            scriptTask.execute(completionHandler: nil)
        } catch {
            print(error)
        }
    }

But I get:

    Error Domain=NSCocoaErrorDomain Code=259 "The file “OpenInSafari.sh” couldn’t be opened because it isn’t in the correct format."

The *.sh file was created by Xcode and is where it's supposed to be in the bundle, with read and write permissions. Any clue what's the problem?

Thanks,
Richard
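One cause worth ruling out (an assumption, not something the error message confirms): NSUserScriptTask runs shell scripts through NSUserUnixTask, which expects the file to carry the execute permission bit, and a copied bundle resource inherits the permissions of the source file. A sketch for inspecting and setting the bit with FileManager; for a signed bundle the practical fix is `chmod +x` on the source file before building:

```swift
import Foundation

// Check and set the POSIX execute bit. A shell script without +x is a
// plausible (assumed) trigger for NSCocoaErrorDomain 259 from
// NSUserScriptTask; inside a signed, read-only bundle the permissions
// must be fixed on the source file before the build copies it.
func isExecutable(_ url: URL) throws -> Bool {
    let attrs = try FileManager.default.attributesOfItem(atPath: url.path)
    let perms = (attrs[.posixPermissions] as? NSNumber)?.uint16Value ?? 0
    return perms & 0o111 != 0
}

func makeExecutable(_ url: URL) throws {
    try FileManager.default.setAttributes([.posixPermissions: 0o755],
                                          ofItemAtPath: url.path)
}
```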
I'm reimplementing a QuickTime video application using AVKit and AVPlayerView, using Xcode 10 on a late-2013 MacBook Pro. The app gets a notification containing the name of the video file to open and the time to which to scroll the video display. All this happens correctly, but when I attempt to scroll the video using a two-finger trackpad gesture, the video jumps back to the start. If I reposition the video time by "replaying" the notification (but not reopening the file itself), then the two-finger gesture now correctly scrolls the video.

Any clue to where to start looking? The same "Play" button is the firstResponder in each case.

Richard
I am reading a video file frame by frame, but not displaying it. I create the following relevant objects: AVURLAsset, AVAssetTrack, AVAssetReader, and AVAssetReaderTrackOutput, but none of these gives me the (pixel x pixel) size of the data returned by assetReaderTrackOutput.copyNextSampleBuffer(). The AVAssetTrack has a naturalSize variable, but it isn't in pixels; some of my video has a natural size of 853x480, but the pixel size is 704x480 according to the Finder. (Right now I just kludge the width value.) Is there a robust way to get this information? I can see that it is buried in the metadata, but why is it not exposed in any of the AV objects I'm using?
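The 853-vs-704 discrepancy matches naturalSize having the pixel aspect ratio already applied: anamorphic NTSC material commonly carries a 40:33 PAR, and 704 x 40/33 is 853.33. The arithmetic:

```swift
// naturalSize = storage pixels x pixel aspect ratio. For anamorphic NTSC
// material with a 40:33 PAR, 704 storage pixels display as ~853.33 points,
// matching an 853x480 naturalSize reported for 704x480 video.
func displayWidth(storagePixels: Double, parNum: Double, parDen: Double) -> Double {
    storagePixels * parNum / parDen
}
```

For the reader pipeline itself, the storage size is available per buffer via CVPixelBufferGetWidth/CVPixelBufferGetHeight on each sample buffer's image buffer, or up front from the track's formatDescriptions via CMVideoFormatDescriptionGetDimensions.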
I have an app that uses AVFoundation to read and analyze, frame by frame, a *.mov file. The app, last built in 2015, would not run on my 2013 PowerBook on macOS 10.14.1, so I tried it on an iMac running 10.13.3. Often, but not always, the app would hang after reading about 26,750 frames from a ~54,000-frame video file. It wouldn't quit; it just hung. At least once, Terminal didn't respond to a control-C abort.

When I tried rebuilding and debugging in Xcode 10, I got a burst of console warnings at about frame 26,750. Most were of the form

    GVA warning: addNewReferenceEntry_MMCO mmcoFunc1 not found requested 11, curr = 12

or

    GVA warning: addNewReferenceEntry_MMCO force release 7

where the first version also appeared with 16,17 and 0,1 as the arguments, and the second version also reported releases of 5 and 1. Subsequent files would be processed correctly, with no warnings. Any clue what is producing these warnings, and what I can do to prevent them?
For years (since ~2014), I've had a macOS Terminal app that reads and analyzes a *.mov file frame by frame. When I updated from macOS 10.13.3 to 10.14.1, it started crashing. When I began debugging, I found that an AVAssetTrack which I had extracted from an AVURLAsset previously kept a reference to that asset in the track.asset variable, but now had nil for its track.asset. Should I have retained the AVURLAsset? Or what?
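The symptom is consistent with track.asset being (or having become) a weak back-reference, so the defensive fix is to hold your own strong reference to the AVURLAsset for as long as the track is in use. A framework-free demonstration of why keeping only the track lets the asset go away:

```swift
// Minimal model of a weak back-reference: Track points weakly at Asset,
// so if nothing else holds the Asset strongly, track.asset becomes nil.
final class Asset {}
final class Track { weak var asset: Asset? }

func makeTrackOnly() -> Track {
    let asset = Asset()
    let track = Track()
    track.asset = asset
    return track  // the asset's last strong reference dies here
}
```

Holding the AVURLAsset in a property alongside the track (rather than letting it go out of scope after extracting the track) would sidestep the nil regardless of which OS release changed the reference semantics.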
As I was updating my system from the App Store, I was offered Xcode 10, which I accepted. After Mojave was installed, I still have Xcode 9.4.1, and the App Store says I'm up to date. If I go to developer.apple.com and find the Xcode 10 page, a button says "Open", but it opens Xcode 9. Any idea where the inconsistency resides?

Richard Nerf
I have created a document-based project with a storyboard containing an NSScrollView, where I changed the class of the NSClipView's child to NSImageView. As part of the debugging, I have added to the NSImageView a couple of NSButtons with some standard NSImage icons.

The NSImage is being created from a *.png file using a trivial "override func read(from url: URL, ofType typeName: String)" in the Document.swift. I know the image is being created properly, since it will display in the NSButtons when I set their .image var. However, neither my image nor the button icon images will display in the NSImageView within the scroll view. I know the scroll view is functioning, since I can pinch & zoom, and the buttons will move and resize as expected.

I thought that an app that read and displayed an image in a scroll view would be trivial to write. Once again, reality is nastier than my imagination.

Thanks,
Richard
I'm using Swift to write a simple iOS app to list my weight data, based loosely on the "Fit" Objective-C example app. At first, I verified that I could get read permission to the data, but when I tried to do a query, the system complained that I needed write permission as well. I thought this odd, but I went back and included the NSHealthUpdateUsageDescription entry in the Info.plist file. Now the app asks for update permission, but not for read permission, and the query returns no records. Any clues?

Richard
I've got a Swift app that looks at a timeline of occurrences. I'd like to be able to pop up the Calendar.app, focused on the date a user has selected within the app. Nothing fancy -- no data comes back from Calendar...

Using ScriptingBridge, I could imagine just running, on the CalendarApplication that is returned from let calendarApp = SBApplication(url: url), the method:

    - (void) viewCalendarAtDate:(NSDate *)at;

or perhaps passing a string to the following script:

    on showDateInCalendar(s as string)
        tell application "Calendar"
            view calendar at date s
        end tell
    end showDateInCalendar

There seem to be lebenty different ways to approach this problem, but they all seem documented in the style of you-have-to-have-understood-this-once-before-you-can-understand-it-again. Any suggestions, or am I doomed to the command line and Python scripts?

Richard
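Of the lebenty ways, shelling out to osascript may be the least ceremony. A sketch that builds the AppleScript source (the date-string format Calendar accepts is locale-dependent, so that part is an assumption to verify) plus a runner that is only meaningful on macOS:

```swift
import Foundation

// Build the AppleScript source that asks Calendar to show a given date.
// The accepted date-string format is locale-dependent -- an assumption.
func calendarScript(showing dateString: String) -> String {
    """
    tell application "Calendar"
        view calendar at date "\(dateString)"
    end tell
    """
}

// Hand the source to /usr/bin/osascript. Defined but not exercised here;
// it only does something useful on macOS with Calendar installed, and
// sandboxed apps need the appropriate Apple events entitlement.
func runAppleScript(_ source: String) throws {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/bin/osascript")
    task.arguments = ["-e", source]
    try task.run()
    task.waitUntilExit()
}
```

Compared with ScriptingBridge, this trades type safety for not having to generate and import a Calendar.h header.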