ResearchKit

Enable your iOS app to become a powerful tool for medical research with the ResearchKit open source framework.

Posts under ResearchKit tag

9 Posts
Post not yet marked as solved · 2 Replies · 1.3k Views
Will Apple Vision Pro support medical imaging diagnostic software such as Horos or OsiriX? I am really curious whether radiologists will be able to view, manipulate, and interpret exams with this headset with ease, use the built-in microphone for voice recognition/dictation while using their hands to manipulate the imaging, and simultaneously view the report in their field of view. This could really unlock some major potential for interpreting at least CT, ultrasound, and MRI exams. I don't think the resolution will be high enough to interpret diagnostic x-rays, and definitely not mammograms, since MQSA regulations and physicist inspections require more detail and information.

However, I want to be at the forefront of bringing Vision Pro headsets into the medical imaging space, with real utilization in clinical practices. It may also be beneficial for patients who are curious to see their medical imaging, or even as headsets to wear while undergoing outpatient imaging-guided biopsy procedures for ******/etc. to help put them at ease during the procedures. This could really improve patient satisfaction, and I think we are only scratching the surface of a world of possibilities in healthcare with devices like this. Once I am using it, I would like to create a presentation series and share information with my radiology colleagues at national/international meetings.
Post not yet marked as solved · 0 Replies · 362 Views
Hello! I am a student at a tertiary epilepsy center. Our team is building an open-source seizure alert app for the Apple Watch that will undergo a clinical trial. A well-conducted clinical trial is crucial for convincing insurers to pay for Apple Watches, which correlates with tech adoption. This imposes some restrictions on data collection. Our use case requires that the app run continuously, as seizures are not predictable. The documentation on extended runtime sessions recommends the smart alarm mode, which lasts for 30 minutes; individuals would not use an alarm system that only works 30 minutes at a time. Is there any way we could keep the alarm running continuously? The requirements for background execution are twofold:

1. During the model development phase, the app would need to continuously collect data from the accelerometer, gyroscope, magnetometer, blood oxygen sensor, and EKG. I understand these are very demanding tasks, so even one sensor would be good to start with. The app would then upload this data to a server over the web (perhaps using CloudKit). If continuous data collection is not possible, what is the longest guaranteed latency between samples? What amount of CPU usage would be permissible, and how would it impact sampling rates?

2. During deployment, the app would need to run an ML model in the background continuously. We have chosen a lightweight model [1] that has been optimized for mobile devices; it has ~1.3 million parameters. What would be the maximum guaranteed latency for running an ML model in the background?

Please let me know if you need any more information. Thank you.

[1] Sannara Ek, François Portet, Philippe Lalanda. Lightweight Transformers for Human Activity Recognition on Mobile Devices. arXiv:2209.11750.
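For context on the 30-minute constraint the poster mentions, this is roughly what WatchKit's extended runtime session API looks like. A minimal sketch, assuming a smart alarm session type declared under the WKBackgroundModes key in the app's Info.plist; the class and delegate names are from WatchKit, but the surrounding controller is made up for illustration:

    import WatchKit

    final class SeizureMonitorSession: NSObject, WKExtendedRuntimeSessionDelegate {
        private var session: WKExtendedRuntimeSession?

        // Smart alarm sessions are scheduled in advance and then run
        // for at most 30 minutes once they fire.
        func schedule(wakeDate: Date) {
            let session = WKExtendedRuntimeSession()
            session.delegate = self
            session.start(at: wakeDate)
            self.session = session
        }

        func extendedRuntimeSessionDidStart(_ extendedRuntimeSession: WKExtendedRuntimeSession) {
            // Begin sampling sensors here.
        }

        func extendedRuntimeSessionWillExpire(_ extendedRuntimeSession: WKExtendedRuntimeSession) {
            // Called shortly before the system ends the session;
            // this is where the 30-minute ceiling bites.
        }

        func extendedRuntimeSession(_ extendedRuntimeSession: WKExtendedRuntimeSession,
                                    didInvalidateWith reason: WKExtendedRuntimeSessionInvalidationReason,
                                    error: Error?) {
            self.session = nil
        }
    }

For the accelerometer specifically, Core Motion's CMSensorRecorder can record in the background for later batch retrieval (recordAccelerometer(forDuration:) followed by accelerometerData(from:to:)), which may sidestep the session limit for that one sensor, at the cost of sample latency.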
Post not yet marked as solved · 3 Replies · 1.8k Views
Hello, I work in ophthalmology at Stanford, and I am hoping to develop a tool for monitoring vision health using Apple Vision Pro. However, I am worried that my goal will not be possible due to Apple privacy protections. I want to develop an app that can gauge eye performance, but to do so I would need precise details about the user's eye movements and eye position. Is this feasible? I am also eager to learn more about the specifics of the eye-tracking setup, like its accuracy and sampling rate. If you have any useful information or suggestions, I would really appreciate it. Thank you!
Posted by aberens.
Post not yet marked as solved · 1 Reply · 1.1k Views
Hello all, this could surely be a newbie mistake, but I'm unable to adopt this protocol in a class I've created in a new project. The error: "Cannot find type 'OCKSurveyTaskViewControllerDelegate' in scope"

    import CareKit
    import CareKitUI
    import CareKitStore
    import ResearchKit
    import UIKit
    import os.log

    final class CareFeedViewController: OCKDailyPageViewController, OCKSurveyTaskViewControllerDelegate {
    }

However, I have been able to locate the OCKSurveyViewController.swift file, which defines the protocol, in the CareKit framework from SPM, but I'm still unable to utilize it. Was the protocol only for demonstration purposes and not for use outside of the WWDC21 CareKit Code Along?
Posted by donthedev.
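For reference, the conformance the code along builds looks roughly like this. A sketch reconstructed from memory of the WWDC21 Recover sample, so the exact method signature should be double-checked against the sample source; reload() is assumed to be OCKDailyPageViewController's refresh method:

    import CareKit
    import CareKitStore
    import ResearchKit

    extension CareFeedViewController {

        // Approximate delegate method from the WWDC21 sample: refresh
        // the daily page once a survey finishes successfully.
        func surveyTask(
            viewController: OCKSurveyTaskViewController,
            for task: OCKAnyTask,
            didFinish result: Result<ORKTaskViewControllerFinishReason, Error>) {

            if case let .success(reason) = result, reason == .completed {
                reload()
            }
        }
    }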
Post not yet marked as solved · 2 Replies · 2.2k Views
Hi, we are a group of PhD students working on developing a health-related app using SensorKit and HealthKit. We submitted our research proposal on Feb 16 and have not heard back from Apple support. We wonder how long the processing time would be. Thanks!
Post not yet marked as solved · 2 Replies · 1.8k Views
Hello everyone, I just started watching the WWDC21 Code Along "Build a Research and Care App, Part 1" and wanted to run the project in Xcode. However, when I clone the repo with the recommended command, git clone --recurse-submodules https://github.com/carekit-apple/WWDC21-RecoverApp.git, and open the project, I get the following error for /WWDC21-RecoverApp/Recover Part 1/Recover.xcodeproj: "This Copy Files build phase contains a reference to a missing file 'ResearchKit.framework'." /WWDC21-RecoverApp/ResearchKit only contains two files (CONTRIBUTING.md and LFS-Files). I was wondering how I could get the project up and running, and whether there are additional steps to take that I didn't see in the presentation / GitHub README. Thanks!
Posted by MxFx.
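A ResearchKit checkout that contains only CONTRIBUTING.md and LFS pointer files usually means the submodule's Git LFS content was never downloaded. A possible fix to try, assuming Git LFS is the culprit (not confirmed from the post alone):

    # Install the Git LFS hooks, make sure the submodule is checked out,
    # then fetch its large files.
    git lfs install
    git submodule update --init --recursive
    cd ResearchKit && git lfs pull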
Post not yet marked as solved · 4 Replies · 3.6k Views
Hello. Our app has been rejected from the App Store because of 1.4 Physical Harm: https://developer.apple.com/app-store/review/guidelines/#safety

This app is used by users from Kenya (only registered users). Users can log health self-assessment data, like blood pressure, glucose level, temperature, body weight, and BMI, with medical hardware devices, and that data is shared with a specialist (physician or clinician) to make a diagnosis and advise a subscription accordingly.

Problem: We have had the above features since earlier versions, but with the new version update our app got rejected from the App Store, stating that we have violated the App Store guidelines (1.4.1 Physical Harm - Safety): "Your app connects to external medical hardware to provide medical services. However, to be compliant with App Store Review Guideline 1.4.1, you must: Provide documentation from the appropriate regulatory organization demonstrating regulatory clearance for the medical hardware used by your app. Provide documentation of a report or peer-reviewed study that demonstrates your app's use of medical hardware works as described."

Our app integrates the following hardware:
a) AccuCheck Instant Glucometer: CE0088; this product fulfils the requirements of the European Directive 98/79/EC on in vitro diagnostic medical devices
b) FORA Digital Thermometer IR21b: CE mark for compliance with the European Directive; integration over Bluetooth
c) FORA Oximeter PO200: CE0123 and IEC 61000-4-3 compliant; integration over Bluetooth
d) FORA Weighing Machine W310b: IEC/EN 61010-1, IEC/EN 61326, EN 301 489-17, and EN 300 328 compliant; integration over Bluetooth
e) Omron BP Machine HEM-9210T: EC & EN compliance; integration over Bluetooth

Please, can anyone help us resolve this issue?
Posted by rmehla31.
Post not yet marked as solved · 0 Replies · 489 Views
Hi, I'm new to ResearchKit. I've encountered a couple of issues which seem to be bugs, but I'm not entirely sure:

1. I have an ORKFormStep that includes an ORKAnswerFormat.scale (scale answer format) and an ORKTextAnswerFormat (text answer format). If I put the ORKTextAnswerFormat first in the formItems array of the ORKFormStep, it displays properly. But if I put it second (after the ORKAnswerFormat.scale), its text parameter is missing. Is this a bug?

2. With the same setup (an ORKFormStep that includes an ORKAnswerFormat.scale and an ORKTextAnswerFormat), neither answer stores a value on Firebase. By comparison, if I use ORKAnswerFormat.choiceAnswerFormat (text choice answer format), it does store a value on Firebase; worth noting that in this case I do have to set an explicit "value" variable as an NSNumber for each ORKTextChoice. Do I also have to manually set a value for ORKAnswerFormat.scale and ORKTextAnswerFormat to match their input somehow? Or is that a bug?

Any help would be appreciated,
Wayland
Posted by waylandf.
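For anyone trying to reproduce the first issue, here is a minimal sketch of the setup described above; the identifiers, titles, and question text are made up for illustration:

    import ResearchKit

    func makeFormStep() -> ORKFormStep {
        let step = ORKFormStep(identifier: "dailyForm",
                               title: "Daily Questions",
                               text: "Answer both items.")

        // 1-10 scale answer format.
        let scaleFormat = ORKAnswerFormat.scale(withMaximumValue: 10,
                                                minimumValue: 1,
                                                defaultValue: 5,
                                                step: 1,
                                                vertical: false,
                                                maximumValueDescription: "High",
                                                minimumValueDescription: "Low")
        let scaleItem = ORKFormItem(identifier: "scaleItem",
                                    text: "Rate today from 1 to 10.",
                                    answerFormat: scaleFormat)

        // Free-text answer format.
        let textFormat = ORKTextAnswerFormat(maximumLength: 250)
        textFormat.multipleLines = true
        let textItem = ORKFormItem(identifier: "textItem",
                                   text: "Any additional comments?",
                                   answerFormat: textFormat)

        // Per the report above, the text item renders correctly in this
        // order; swapping the two reproduces the missing text parameter.
        step.formItems = [textItem, scaleItem]
        return step
    }

On the Firebase question: ResearchKit does not persist results anywhere on its own, so each answer has to be read out of the task result (ORKScaleQuestionResult.scaleAnswer and ORKTextQuestionResult.textAnswer for the two formats above) and serialized into a dictionary before being written to Firebase. Only ORKTextChoice needs an explicit value, because that value is what identifies the selected choice in its result.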