I would like to integrate the new Search functionality, but I have some questions about whether what I'm trying to achieve is possible and, if so, which approach I should take to implement it.

My app has thousands of entries that could be searched and revealed in the app. It's basically a database of things in the same category - none of them is tied to a specific user; they are all static, unique entries that can easily be searched by keyword. It's similar to a database of medical conditions. I want anyone to be able to search for any of them, even if they don't have the app installed. In trying to figure out how to do this, I have a few questions:

Do I use NSUserActivity or the Core Spotlight API for this? NSUserActivity lets you specify eligibleForPublicIndexing, but I don't see a description or image property to set - only a title. It was also mentioned that you can't let your activity dealloc, and I surely can't hold references to thousands of entries. But I don't see any way to make an entry
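For the on-device side, Core Spotlight is the API with the description and image slots being asked about: CSSearchableItemAttributeSet has title, contentDescription, and thumbnailData, and CSSearchableIndex accepts items in batches, so nothing has to stay alive after indexing. A minimal sketch, assuming a hypothetical Entry type standing in for one database record (public indexing for users without the app is a separate NSUserActivity/web-markup story):

```swift
import CoreSpotlight
import MobileCoreServices

// Hypothetical type standing in for one database record.
struct Entry {
    let id: String
    let title: String
    let summary: String
}

func indexEntries(_ entries: [Entry]) {
    let items = entries.map { entry -> CSSearchableItem in
        let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
        attributes.title = entry.title
        attributes.contentDescription = entry.summary
        return CSSearchableItem(uniqueIdentifier: entry.id,
                                domainIdentifier: "entries",
                                attributeSet: attributes)
    }
    // Batch-index; the items are copied, so no references need to be retained.
    CSSearchableIndex.default().indexSearchableItems(items) { error in
        if let error = error { print("Indexing failed: \(error)") }
    }
}
```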
Dear all, First of all, sorry if this post sounds simplistic, but it is my first attempt to develop for any Apple platform :) Is there a tool that can be used to create nice-looking GUI interfaces that can then be imported into Xcode to code the backend of the application? I am asking since I am used to certain .NET GUI components and tools that made this task simple for people like me who are not really that artistic. I am trying to develop a CRUD frontend for an in-house MySQL database. Any help or direction to good resources is greatly appreciated.
Hi there, We are planning to release an app that will enhance our physical book sold in retail stores. It complements the book with audio snippets and drawings/pictures. The app will be free for all on the App Store and will contain basic information extracted from the book. People who have purchased the book can then unlock the complementary drawings and audio by entering a code printed inside the book itself. Those who didn't purchase the book can also unlock the same content via in-app purchase if they want. Please note that we don't plan to put any website link in the app that would allow users to buy the book online. We understand this falls under guideline 11.16 and believe it should be acceptable. Can you please confirm/guide us? Thanks!
Visit the last tab of Xcode's preferences; there is a pop-up button to choose the default command line tools. You don't need to install the CLT separately if you have Xcode installed (since Xcode 5, as I recall) - it is embedded inside Xcode. Normally, a beta Xcode would not set itself as the default CLT if a production version is installed, but in your case it could be that you've manually installed a separate CLT.
Topic: Developer Tools & Services
SubTopic: General
Thanks very much for your reply. Yes, I'll only ever be using one audio buffer (not multiple). I just wanted to make sure that using AudioBufferList as I'm doing (that is, with only a single buffer) wasn't one of the places where the Swift API isn't solid yet.

I did search the archives for 'AudioBufferList Swift' and read everything I could find on the topic. A lot of the discussion was about using multiple buffers with AudioBufferList rather than just one, which is a somewhat different problem (and probably a more difficult one). As I mentioned, I did find a couple of examples similar to mine (found elsewhere online), but they were somewhat speculative and/or offered with caveats, so I was looking for something more authoritative.

Thanks again.
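For what it's worth, the Swift overlay provides UnsafeMutableAudioBufferListPointer specifically so you don't have to fight the variable-length C struct by hand. A minimal single-buffer sketch (the frame count and mono Float32 format are arbitrary choices here):

```swift
import CoreAudio
import Foundation

// Allocate an AudioBufferList sized for exactly one buffer via the Swift overlay.
let ablPointer = AudioBufferList.allocate(maximumBuffers: 1)

// Back the single buffer with 512 frames of mono Float32 samples.
let frameCount = 512
let samples = UnsafeMutablePointer<Float>.allocate(capacity: frameCount)
ablPointer[0] = AudioBuffer(
    mNumberChannels: 1,
    mDataByteSize: UInt32(frameCount * MemoryLayout<Float>.size),
    mData: UnsafeMutableRawPointer(samples))

// Pass ablPointer.unsafeMutablePointer to C APIs expecting AudioBufferList*.
// Clean up when done (real code would tie these to an owning object's lifetime).
samples.deallocate()
free(ablPointer.unsafeMutablePointer)
```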
Topic: Programming Languages
SubTopic: Swift
This is also happening to me with my Sound Blaster X7 USB DAC. Most of the time the output option is completely gone from the system; after a reboot it appears again. When I then play sound, it is completely distorted. The Sound Blaster app recognizes the connected device in both situations, by the way.
Topic: App & System Services
SubTopic: Core OS
Why does that have to be generic? It sounds like a perfect fit for a subclass.
Topic: Programming Languages
SubTopic: Swift
This sounds like an issue with the EFI (or similar low-level OS), which is likely updated along with other base code in each firmware release. Assuming that this device, like most other Apple systems, uses NVRAM to store its settings, those settings might get reset by holding down the power and home buttons for a minute or so and then plugging in the power adapter. The long delay between plugging the cable in and seeing any charging activity is basically just a severely depleted battery. Some things can also kick-start the recharge process, such as using a 10- or 12-watt power adapter: if you have an iPad charger lying around, try that. Charging from your Mac will generally be slower than with the iPad charger. And while this may well be due to the awful power management in iOS 9 Beta 1, I've (rarely) seen the same thing happen to brand-new devices straight from the factory.
Topic: App & System Services
SubTopic: Core OS
Weird... I just copied and pasted that into one of my projects and it worked. Perhaps try deleting your derived data? I see odd errors pop up from time to time with the Swift compiler and that will sometimes take care of it.
Topic: App & System Services
SubTopic: Core OS
I'm building an iOS application using CloudKit. It should allow a user to add, edit, and delete Products in the cloud. The user can organize these in Folders, which are just another Record Type in CloudKit. There is no limit to how deeply folders can be nested, since any folder can hold a CKReference to its parent Folder. All CloudKit communication in my app happens in a dedicated, custom CloudKitController class.

This all works, but stops working after a while for no clear reason. When I test my app, I don't even use folders that are multiple levels deep. However, after testing it for a while (creating, editing, and deleting products for up to a week), all deleted records seem to reappear on CloudKit. A couple of notes on this:

When I reset my CloudKit dashboard and start all over again, it works perfectly, with no code changes made. Obviously, I'm constantly editing my code as the app is in development. However, I generally don't edit the data types in my code that are to be stored in CloudKit. When I do, this iss
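For reference, this is roughly the shape of the parent link being described; the record type and field names here are my assumptions, not taken from the post. Note that the reference action matters for deletions: .deleteSelf cascades a parent's deletion to its children, while .none leaves them in place.

```swift
import CloudKit

// Hypothetical records; "Folder" and the field names are assumptions.
let parentFolder = CKRecord(recordType: "Folder")
parentFolder["name"] = "Documents" as NSString

let childFolder = CKRecord(recordType: "Folder")
childFolder["name"] = "Invoices" as NSString
// .deleteSelf: deleting the parent also deletes this child record.
childFolder["parent"] = CKReference(record: parentFolder, action: .deleteSelf)

CKContainer.default().privateCloudDatabase.save(childFolder) { _, error in
    if let error = error { print("Save failed: \(error)") }
}
```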
Hi, I am using Core Location to stream position data, and I just checked (with a print statement): the region is valid and as expected. I'm almost wondering whether it's a bug in the simulator. Unfortunately, the simulator doesn't include standard Apple apps like Maps, which could show me whether maps are working on it at all. I'm a bit hesitant to try running it on my actual watch, but I might have to break down and do it. I even tried turning off streaming location data and asking for the map only once, but still no go - just a spinning wheel. Thanks for the help. I thought maybe something else needed to be done, but it sounds like I just need to display my region.
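Displaying the region on a WatchKit map comes down to one call on WKInterfaceMap; a minimal sketch (the controller, outlet name, and span values are my own assumptions):

```swift
import WatchKit
import MapKit

class MapController: WKInterfaceController {
    @IBOutlet var map: WKInterfaceMap!   // hypothetical outlet name

    func show(_ coordinate: CLLocationCoordinate2D) {
        // Roughly a 1 km window around the coordinate.
        let span = MKCoordinateSpan(latitudeDelta: 0.01, longitudeDelta: 0.01)
        map.setRegion(MKCoordinateRegion(center: coordinate, span: span))
        map.addAnnotation(coordinate, with: .red)
    }
}
```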
Topic: App & System Services
SubTopic: Core OS
We are developing an application that would benefit greatly from getting two-channel audio input - one channel from the base mic, the other from the front mic (or any other mic). With these two inputs we can do application-specific noise cancelling.

Is this possible with the new APIs introduced in iOS 9? We have C/Objective-C experience, etc. I looked into this for iOS 7/8, and it does not seem possible. The only noise cancellation I can get is by calling Apple's built-in one, which is completely wrong for our purposes.

NSArray* inputs = [[AVAudioSession sharedInstance] availableInputs];

will list multiple inputs, but I can't record from the base mic and another mic at the same time. We are willing to write our own AUAudioUnit if that would help. --T
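For context, this is the session-level machinery involved. Whether the hardware actually delivers independent streams from two physical mics is exactly the open question here, so treat this as a sketch of what can be requested, not a confirmation that it works:

```swift
import AVFoundation

func configureTwoChannelInput() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)

    // Ask for two input channels; the hardware decides whether to honor it.
    try session.setPreferredInputNumberOfChannels(2)
    print("channels granted:", session.inputNumberOfChannels)

    // Inspect the ports and their data sources (e.g. bottom/front/back mics).
    for port in session.availableInputs ?? [] {
        let sources = port.dataSources?.map { $0.dataSourceName } ?? []
        print(port.portName, sources)
    }
}
```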
Here's a simple master playlist to illustrate the question:

#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio_low",LANGUAGE="en",NAME="English",AUTOSELECT=YES,DEFAULT=YES,URI="audio_low.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio_high",LANGUAGE="en",NAME="English",AUTOSELECT=YES,DEFAULT=YES,URI="audio_high.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=100000,CODECS="mp4a.40.2,avc1.4d401e",AUDIO="audio_low"
video.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=200000,CODECS="mp4a.40.2,avc1.4d401e",AUDIO="audio_high"
video.m3u8

During playback, we can look at AVPlayerItemAccessLogEvent.URI to identify the currently playing EXT-X-STREAM-INF (in this case it will always be video.m3u8). Is there any similar way to identify the currently playing EXT-X-MEDIA?

If the audio codecs were different (let's say AAC and AC-3), we might be able to find out by looking at the codec in AVAssetTrack.formatDescriptions. But that's hacky and doesn't work if the codecs are the same, as in the example.

Ideally, there would be a way to get ei
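As a point of comparison, this is the variant-level information the access log does expose; nothing in it names the audio GROUP-ID, which is the gap being asked about (a sketch of what's available, not a workaround):

```swift
import AVFoundation

func logCurrentVariant(_ item: AVPlayerItem) {
    guard let event = item.accessLog()?.events.last else { return }
    print("URI:", event.uri ?? "-")                      // e.g. "video.m3u8"
    print("indicated bitrate:", event.indicatedBitrate)  // EXT-X-STREAM-INF BANDWIDTH
    print("observed bitrate:", event.observedBitrate)
}
```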
It works fine in the simulator; I can see that it picks up audio. However, when debugging on the device, the blue indicator is stuck to the left and doesn't pick up any audio. The iPhone did ask for permission to use the microphone, and that was allowed. Has it worked for anyone so far?
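Two things worth checking when the device (unlike the simulator) records silence are the recording permission and whether the session category actually enables input; a quick hedged sanity check:

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
session.requestRecordPermission { granted in
    print("record permission granted:", granted)
}
// Input only works with an input-capable category such as playAndRecord/record.
print("current category:", session.category)
```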
A part of an application I'm working on uses audio units. In particular, it uses an audio unit with subtype kAudioUnitSubType_AudioFilePlayer. Is it possible to set the playback rate of such an audio unit? If so, how? I thought the parameter kAudioUnitParameterUnit_Rate was what I was looking for, but I've been unable to get it to work. Any help would be appreciated.
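One note: kAudioUnitParameterUnit_Rate is a parameter's unit of measure, not a parameter ID. A common approach (my suggestion, not confirmed by this thread) is to route the file player through a Varispeed or NewTimePitch unit and set that unit's rate parameter:

```swift
import AudioToolbox

// Sketch: set the rate on a Varispeed unit placed after the file player.
// `varispeedUnit` is assumed to be an AudioUnit already opened in your AUGraph.
func setPlaybackRate(_ rate: Float, on varispeedUnit: AudioUnit) {
    let status = AudioUnitSetParameter(varispeedUnit,
                                       kVarispeedParam_PlaybackRate,
                                       kAudioUnitScope_Global,
                                       0,     // element
                                       rate,  // 1.0 = normal, 2.0 = double speed
                                       0)     // buffer offset in frames
    if status != noErr { print("AudioUnitSetParameter failed: \(status)") }
}
```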