Post not yet marked as solved
Hi,
are there any hardware differences between the M1 and M2, apart from performance?
As the title suggests, will I be able to create and test apps for both Intel and Arm (M1/M2) on my M1 Mac with the same code in Xcode?
And how can I make a universal DMG without having an Intel Mac? Also, correct me if I'm wrong: does the swapping associated with Rosetta hamper SSD life?
Thanks in advance.
I have code that has worked for many years for writing ProRes files, and it is now failing on the new M1 Max MacBook Pro. Specifically, if I construct buffers with the pixel type kCVPixelFormatType_64ARGB, the pixel buffer pool becomes nil after a few frames of writing. This code works just fine on non-Max processors (Intel and base M1, running natively).
Here's a sample main that demonstrates the problem. Am I doing something wrong here?
// main.m
// TestProresWriting
//
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        int timescale = 24;
        int width = 1920;
        int height = 1080;
        NSURL *url = [NSURL URLWithString:@"file:///Users/diftil/TempData/testfile.mov"];
        NSLog(@"Output file = %@", [url absoluteURL]);
        NSFileManager *fileManager = [NSFileManager defaultManager];
        NSError *error = nil;
        [fileManager removeItemAtURL:url error:&error];

        // Set up the writer
        AVAssetWriter *trackWriter = [[AVAssetWriter alloc] initWithURL:url
                                                               fileType:AVFileTypeQuickTimeMovie
                                                                  error:&error];
        // Set up the track
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecTypeAppleProRes4444, AVVideoCodecKey,
                                       [NSNumber numberWithInt:width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:height], AVVideoHeightKey,
                                       nil];
        AVAssetWriterInput *track = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                       outputSettings:videoSettings];

        // Set up the adapter
        NSDictionary *attributes = [NSDictionary
            dictionaryWithObjects:
                [NSArray arrayWithObjects:
                    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_64ARGB], // This pixel type causes problems on M1 Max, but works on everything else
                    [NSNumber numberWithUnsignedInt:width],
                    [NSNumber numberWithUnsignedInt:height],
                    nil]
            forKeys:
                [NSArray arrayWithObjects:
                    (NSString *)kCVPixelBufferPixelFormatTypeKey,
                    (NSString *)kCVPixelBufferWidthKey,
                    (NSString *)kCVPixelBufferHeightKey,
                    nil]];
        /*
        NSDictionary *attributes = [NSDictionary
            dictionaryWithObjects:
                [NSArray arrayWithObjects:
                    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB], // This pixel type works on M1 Max
                    [NSNumber numberWithUnsignedInt:width],
                    [NSNumber numberWithUnsignedInt:height],
                    nil]
            forKeys:
                [NSArray arrayWithObjects:
                    (NSString *)kCVPixelBufferPixelFormatTypeKey,
                    (NSString *)kCVPixelBufferWidthKey,
                    (NSString *)kCVPixelBufferHeightKey,
                    nil]];
        */
        AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:track
                                       sourcePixelBufferAttributes:attributes];

        // Add the track and start writing
        [trackWriter addInput:track];
        [trackWriter startWriting];
        CMTime startTime = CMTimeMake(0, timescale);
        [trackWriter startSessionAtSourceTime:startTime];
        while (!track.readyForMoreMediaData)
            ; // busy-wait until the input is ready

        int frameTime = 0;
        CVPixelBufferRef frameBuffer = NULL;
        for (int i = 0; i < 100; i++)
        {
            NSLog(@"Frame %d", i);
            CVPixelBufferPoolRef pixelBufferPool = pixelBufferAdaptor.pixelBufferPool;
            if (pixelBufferPool == nil)
            {
                NSLog(@"PixelBufferPool is invalid.");
                exit(1);
            }
            CVReturn ret = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &frameBuffer);
            if (ret != kCVReturnSuccess)
            {
                NSLog(@"Error creating framebuffer from pool");
                exit(1);
            }
            CVPixelBufferLockBaseAddress(frameBuffer, 0);
            // This is where we would put image data into the buffer. Nothing right now.
            CVPixelBufferUnlockBaseAddress(frameBuffer, 0);
            while (!track.readyForMoreMediaData)
                ;
            CMTime presentationTime = CMTimeMake(frameTime + (i * timescale), timescale);
            BOOL result = [pixelBufferAdaptor appendPixelBuffer:frameBuffer
                                           withPresentationTime:presentationTime];
            if (result == NO)
            {
                NSLog(@"Error appending to track.");
                exit(1);
            }
            CVPixelBufferRelease(frameBuffer);
        }

        // Close everything
        if (trackWriter.status == AVAssetWriterStatusWriting)
        {
            [track markAsFinished];
            // The original sample never finalized the file; without this,
            // the movie on disk may be left incomplete.
            [trackWriter finishWritingWithCompletionHandler:^{}];
            while (trackWriter.status == AVAssetWriterStatusWriting)
                [NSThread sleepForTimeInterval:0.01];
        }
        NSLog(@"Completed.");
    }
    return 0;
}
iOS apps can run on M1 Macs. We have an iOS app that we'd like to run on macOS, but we would like to alter the app's behaviour depending on whether or not it is running on macOS.
Behaviour we want to tailor:
Changing tooltips from 'tap' to 'click' and 'swipe' to 'scroll'.
Selecting higher-resolution images from our game assets.
Is there a way to programmatically detect that the app is running on macOS? The user interface idiom seems to return the iPad idiom.
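For what it's worth, a minimal sketch of one way to do this, assuming a deployment target that allows the iOS 14 availability check: `ProcessInfo` exposes a flag for exactly this case.

```swift
import Foundation

// Detect an unmodified iOS binary running on an Apple Silicon Mac.
// (A Mac Catalyst build would check isMacCatalystApp instead.)
if #available(iOS 14.0, *), ProcessInfo.processInfo.isiOSAppOnMac {
    // Running on a Mac: prefer "click"/"scroll" wording and
    // the higher-resolution asset set.
} else {
    // Running on an actual iPhone or iPad.
}
```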
I'm working on an iPad app which I want to be able to run properly on the M1 Mac. I am not using Mac Catalyst because I am using OpenGL, which is not available for Catalyst. Instead, I am targeting "My Mac (Designed for iPad)" when I build for the M1 Mac.
I would like to know when my app has been placed in full-screen mode (when running on the M1 Mac). macOS (AppKit) provides NSWindowDidEnterFullScreenNotification to help with this.
In general, I'd love to be able to access these NSWindow* notifications from within my iOS app, but specifically I get the error "Use of undeclared identifier 'NSWindowDidEnterFullScreenNotification'" when I use [NSNotificationCenter defaultCenter] addObserver to get notified when my app goes full screen.
Any ideas as to how I can bridge this gap between iOS and macOS and get notifications from macOS, without resorting to Mac Catalyst? This capability would be extremely useful.
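One hedged workaround (an assumption, not a confirmed approach: it relies on the raw notification name staying stable across macOS releases): Foundation notification names are plain strings, so the AppKit notification can be observed by name without the AppKit symbol being visible to iOS code.

```swift
import UIKit

// Observe AppKit's full-screen notification by its raw string name, since
// the NSWindowDidEnterFullScreenNotification identifier isn't declared
// for iOS compilation targets.
let enteredFullScreen = Notification.Name("NSWindowDidEnterFullScreenNotification")
NotificationCenter.default.addObserver(forName: enteredFullScreen,
                                       object: nil,
                                       queue: .main) { _ in
    // React to the window entering full screen on the Mac.
    print("Entered full screen")
}
```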
M1 Macs can run iOS applications.
We have an iOS application that runs a full-screen Metal game. The game can also run across all desktop platforms via Steam. In addition to Steam, we would like to make it available through the App Store on macOS. We'd like to utilise our iOS builds for this so that the Apple payment (micro-transaction) and sign-in processes can be reused.
While the app runs on macOS, it runs in a small iPad-shaped window that cannot be resized. We do not want to add iPad multitasking support (portrait orientation is not viable), but we would like the window on macOS to be expandable to full screen. Currently there is an option to make it full screen, but the Metal view's (MTKView) delegate does not receive a drawableSizeWillChange callback for this, meaning the new resolution of the window cannot be obtained.
Is there another method of receiving a window-size-change event in this context? And what is the recommended way of enabling window resizing on macOS but not on iPad for a single iOS app?
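A sketch of one possible fallback, under the assumption that a Mac window resize still drives UIKit's normal size-transition path (the class and property names below are illustrative, not from the original post): the hosting view controller's viewWillTransition(to:with:) should fire even when the MTKView delegate's drawableSizeWillChange does not.

```swift
import UIKit
import MetalKit

class GameViewController: UIViewController {
    var mtkView: MTKView!   // hypothetical reference to the game's Metal view

    override func viewWillTransition(to size: CGSize,
                                     with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        coordinator.animate(alongsideTransition: nil) { _ in
            // Recompute the drawable size from the new point size and the
            // screen's scale, then resize render targets accordingly.
            let scale = self.view.window?.screen.nativeScale ?? 1.0
            let newDrawableSize = CGSize(width: size.width * scale,
                                         height: size.height * scale)
            _ = newDrawableSize // resize pass would go here
        }
    }
}
```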
Captured from the WWDC 2020 video "Optimize Metal apps and games with GPU counters" (session 10603).
How can I get information about the System Level Cache? Is it inside the SoC? If not, where is it?
I have developed an iOS application which works well when run from TestFlight on my iOS device.
I would like to release the app on Apple Silicon macOS devices, so I checked the box to specify that the app can be released on Apple Silicon Macs and downloaded TestFlight on my Mac.
When I launch my app from TestFlight on the Mac, the app closes immediately. I cannot locate any error messages in the Console app. What is the best way to debug the problem that's causing the app to close instantly?
Thanks.
I'm using an M1 Pro and have successfully installed NumPy with Accelerate by following a guide, and it really speeds up my programs. I also ran np.test() to check correctness, and every test passed.
However, I can't install SciPy with Accelerate, since the official documentation says Accelerate ships a LAPACK that is too old. I can't even find a SciPy build that passes scipy.test(). I tried the commands below:
conda install numpy 'libblas=*=*accelerate'
conda install scipy
np.test() fails, and sp.test() can't even finish.
conda install numpy 'libblas=*=*openblas'
conda install scipy
Both np.test() and sp.test() finish, but with many failures. I believe the bugs are due to Conda.
pip install --no-binary :all: --no-use-pep517 numpy
pip install scipy
np.test() has no failures and runs fast; sp.test() uses OpenBLAS and has 3 failures. This is the best combination I have found.
So my question is: can we find a reliable build of SciPy for the M1? Considering the popularity of SciPy, I don't think that's too much to ask.
And a question for Apple: is there really a plan to upgrade the LAPACK in Accelerate?
What is the policy regarding macOS apps which only support Apple Silicon?
I am looking at building something around the Virtualization framework but it appears to only support the features I want on Apple Silicon macOS devices.
Can I still publish this on Mac App Store? Is there a workaround for this if not?
Due to some of what we do with the Metal API in our iOS app, we are unable to use the Simulator for development and testing purposes.
With the introduction of Apple Silicon, we are able to run/build to the "My Mac (Designed for iPad)" destination for our app.
However, when we try running UI tests with that destination we get the message "Cannot test target <ui_test_target> on "My Mac": UI tests are not supported ..."
Wondering if there's a way to run UI tests against the "My Mac (Designed for iPad)" destination?
Hi there,
I am developing an iOS app and while building my project I run into the following error message:
Error: Trace/BPT trap: 5
I didn't find anything online to fix this problem, so I wanted to ask if anyone here might be able to help.
I also had issues with CocoaPods on my Apple Silicon Mac, so I want to list the steps I've already tried:
First, my setup: M1 MacBook Pro, macOS 11.1
Xcode version 12.4
CocoaPods with pods for Firebase Swift, Auth, Firestore, and Storage
Steps I tried: Cmd+Shift+K to clean the build folder
closing Xcode and opening Terminal using Rosetta
deleting the ~/Library/Developer/Xcode/DerivedData folder
pod deintegrate in project directory
delete Podfile.lock, app.xcworkspace, Pods directory
pod install
in app and pods projects build settings setting Excluded Architectures for any iOS Simulator SDK to arm64
setting Build Active Architecture Only to yes
convert Pods Project to Swift 5
build Pods Project
build app project
And then the attached error down below occurs.
I hope someone can help me fix this, because I have already wasted so much time on it and have a deadline for the app, since it is a university project.
Merge swiftmodule (x86_64) log output - https://developer.apple.com/forums/content/attachment/df3ffc80-0778-49fd-b5de-0195d4670148
I read that the error occurred while serializing the class BaseViewModel, so I'll attach my code from Base.swift containing that class as well.
Base.swift - https://developer.apple.com/forums/content/attachment/4e3b16f9-bf66-496b-9da1-58fcbb6150c6
Currently testing our app's viability for running on native Apple Silicon and we're noticing that UIApplication.userDidTakeScreenshotNotification is not being invoked when taking a screenshot.
NotificationCenter.default.addObserver(self, selector: #selector(screenshotTaken), name: UIApplication.userDidTakeScreenshotNotification, object: nil)
Is there another API I should be using, or is this feature just not possible when running on a Mac?
I'm trying to build older versions of Python, such as 3.6, 3.7, and 3.8, using pyenv, and I'm getting build failures. I am on Apple Silicon and the latest macOS, 12.3.1. On older OS versions these would build fine.
The following is the failure I get when running pyenv install 3.6.9:
python-build: use openssl@1.1 from homebrew
python-build: use readline from homebrew
Downloading Python-3.6.9.tar.xz...
-> https://www.python.org/ftp/python/3.6.9/Python-3.6.9.tar.xz
Installing Python-3.6.9...
python-build: use readline from homebrew
python-build: use zlib from xcode sdk
BUILD FAILED (OS X 12.3.1 using python-build 20180424)
Inspect or clean up the working tree at /var/folders/sn/46fncn513kng4ssgwn398m5h0000gn/T/python-build.20220403110510.8666
Results logged to /var/folders/sn/46fncn513kng4ssgwn398m5h0000gn/T/python-build.20220403110510.8666.log
Last 10 log lines:
checking for --with-cxx-main=<compiler>... no
checking for clang++... no
configure:
By default, distutils will build C++ extension modules with "clang++".
If this is not intended, then set CXX on the configure command line.
checking for the platform triplet based on compiler characteristics... darwin
configure: error: internal configure error for the platform triplet, please file a bug report
make: *** No targets specified and no makefile found. Stop.
Hello,
We are experiencing failing UI Tests on M1 devices (Xcode 13.3.1) while they pass well on Intel devices.
The most common errors are:
The app gets stuck on the splash screen, resulting in an XCTAssert timeout
Scrolling does not work properly (as if there is no deceleration), resulting in a wrong screen position and elements not being found
Notes:
Simulator is running without Rosetta
We have to exclude arm64 simulator architecture support because of third-party libraries
Has anyone encountered such a problem?
Thanks!
For some simulation workloads I have, I would like to use the system to its full potential and therefore use both P and E cores. Splitting the workload into independent tasks is not easily possible (the threads communicate with each other and run in semi-lockstep). I can allocate smaller portions of the domain to the E cores (and iteratively adjust this so they take the same amount of time as the P cores).
But in order for this to work well, I need to ensure that a given thread (with its associated workload) is bound to the right type of core: *either* the performance (doing larger chunks of the domain) or the efficiency (doing smaller chunks of the domain) cores.
What's the best way to do this? So far, I don't think thread-to-core affinity has been something you could choose on macOS.
The documentation mentions QoS classes, but which class(es) (or relative priorities) would I pick?
pthread_set_qos_class_self_np(QOS_CLASS_UTILITY, 0);
The existing classifications don't map well: the work is user-initiated (they launched a console application), but it is not a GUI program. Would I use 4 threads with QOS_CLASS_UTILITY and 4 with QOS_CLASS_BACKGROUND? Or would I just use UTILITY with relative priorities to separate performance cores from efficiency cores?
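As a sketch of the split being described (an assumption about the approach, not a confirmed recommendation): each worker thread can be given its own QoS, e.g. .utility for the threads meant to lean toward P cores and .background for those meant for E cores. Note that QoS is a scheduling hint, not a hard core affinity.

```swift
import Foundation

// Spawn a worker thread with an explicit quality-of-service class.
// QoS biases scheduling toward P or E cores; it does not pin threads.
func spawnWorker(qos: QualityOfService, work: @escaping () -> Void) -> Thread {
    let thread = Thread(block: work)
    thread.qualityOfService = qos   // .utility (P-leaning) or .background (E-leaning)
    thread.start()
    return thread
}

// Hypothetical split: 4 workers on large chunks, 4 on small chunks.
let pWorkers = (0..<4).map { _ in spawnWorker(qos: .utility)    { /* large chunk */ } }
let eWorkers = (0..<4).map { _ in spawnWorker(qos: .background) { /* small chunk */ } }
```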
Hi there.
I'm trying to run an Objective-C app with the Iobfs4proxy framework.
I get the following error:
ld: warning: directory not found for option '-F/Users/ivan/Projects/buddy-onion/Tob/../Carthage/Build/iOS'
ld: in /Users/ivan/Projects/buddy-onion/Tob/Iobfs4proxy.framework/Iobfs4proxy(go.o), section __TEXT/__text address out of range file '/Users/ivan/Projects/buddy-onion/Tob/Iobfs4proxy.framework/Iobfs4proxy' for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Device: MacBook Air M1 (2020)
OS: Monterey 12.2.1
Xcode: 13.3
iOS: 15.3.1 (real device)
Hi,
I'm looking for a developer to update a currently Intel-only legacy kext into a system extension so it works on M1 / ARM / Apple Silicon.
Here's the problem:
I'm a heavy user of a legacy piece of software named ControllerMate. Unfortunately, it has been abandoned by the developer, and he didn't make it open source. The latest update was in late 2018, and the developer has stopped responding entirely and cannot be reached, even by previous beta testers. This has been confirmed by several power users who have tried over the years.
The app basically allows creation of macros and cascades and is extremely powerful. It's used by many people in the video post production and music production space in their professional workflows, and people built individual workflows around this over the years to work faster and more efficiently.
Since it hasn't been updated, people are somewhat stuck and can't upgrade without breaking their workflows. Especially now with the new and powerful M1s, this becomes much more urgent and relevant again.
The kext might have to be reverse engineered or hacked - I know, not something people in an Apple developer forum wanna hear, but we're desperate to find somebody for this and keep this going.
I truly appreciate any input, insights and leads!
Thank you!
Hello,
I've run into a weird problem with a framework imported into my app. MyFramework uses some other frameworks, like OpenSSL. I build MyFramework with this script:
FrameworkName="MyFramework"
rm -rf build/
xcodebuild archive -scheme "$FrameworkName" \
-configuration Debug -destination 'generic/platform=iOS' \
-archivePath "./build/$FrameworkName.framework-iphoneos.xcarchive" \
SKIP_INSTALL=NO \
BUILD_LIBRARIES_FOR_DISTRIBUTION=YES
xcodebuild archive -scheme "$FrameworkName" \
-configuration Debug -destination 'generic/platform=iOS Simulator' \
-archivePath "./build/$FrameworkName.framework-iphonesimulator.xcarchive" \
SKIP_INSTALL=NO BUILD_LIBRARIES_FOR_DISTRIBUTION=YES
xcodebuild archive -scheme "$FrameworkName" \
-configuration Debug -destination 'generic/platform=macOS' \
-archivePath "./build/$FrameworkName.framework-macos.xcarchive" \
SKIP_INSTALL=NO BUILD_LIBRARIES_FOR_DISTRIBUTION=YES
# Fix https://bugs.swift.org/browse/SR-14195 (caused by https://bugs.swift.org/browse/SR-898)
pattern="./build/$FrameworkName.framework-iphoneos.xcarchive/Products/Library/Frameworks/$FrameworkName.framework/Modules/$FrameworkName.swiftmodule/*.swiftinterface"
grep -rli "$FrameworkName.$FrameworkName" $pattern \
| xargs sed -i '' "s,$FrameworkName.$FrameworkName,$FrameworkName,g"
# end fix
xcodebuild -create-xcframework \
-framework "./build/$FrameworkName.framework-iphoneos.xcarchive/Products/Library/Frameworks/$FrameworkName.framework" \
-framework "./build/$FrameworkName.framework-iphonesimulator.xcarchive/Products/Library/Frameworks/$FrameworkName.framework" \
-framework "./build/$FrameworkName.framework-macos.xcarchive/Products/Library/Frameworks/$FrameworkName.framework" \
-output "./build/$FrameworkName.xcframework"
# Verify the result. Note: xcodebuild runs synchronously (nothing was
# launched in the background, so $! would be empty); check its exit
# status directly instead.
echo "xcodebuild -create-xcframework finished with exit status: $?"
[[ ! -d "./build/$FrameworkName.xcframework/" ]] && {
msg="[ERROR] expected ./build/$FrameworkName.xcframework/ to exist"; echo -e $msg
exit 1
}
As you can see, it also builds the framework for the macOS platform.
A little digression: I prefer using Carthage to build xcframeworks, but here it doesn't work, and I don't know the reason.
The aforementioned build solution works great: none of the frameworks included in MyFramework have any issues when I use MyFramework in my iOS app. The problem starts when using MyFramework in a macOS target app. It shows two related errors; I tried some solutions but didn't find a proper fix. The errors:
blablabla/DerivedData/MyApp-hdzvzxtsnsdivzcialzikxjdwidw/Build/Products/Debug/MyFramework.framework/Modules/MyFramework.swiftmodule/arm64-apple-macos.swiftinterface:9:8: error: no such module 'OpenSSL'
import OpenSSL
And the second one:
blablabla/Helpers/Injected/Services.swift:10:8: error: failed to build module 'MyFramework' for importation due to the errors above; the textual interface may be broken by project issues or a compiler bug
import MyFramework
I can import the OpenSSL framework into the macOS app, but then I face the same problem with the next framework on MyFramework's import list - so that isn't a real solution...
What causes this problem?
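One technique that might help here, under the assumption that OpenSSL is only used internally by MyFramework and none of its types appear in MyFramework's public API: mark the import as implementation-only so it is not recorded in the generated .swiftinterface, and client apps no longer need to resolve the OpenSSL module themselves.

```swift
// Inside MyFramework's own sources (not the app). This keeps OpenSSL out
// of the emitted .swiftinterface, so clients of MyFramework don't need
// the OpenSSL module to be visible at their import site.
@_implementationOnly import OpenSSL
```

With this in place, any public declarations that currently expose OpenSSL types would need to be wrapped in framework-local types instead.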
Thanks in advance for any help... 🥺
Hi,
When debugging an iOS AUv3 extension in GarageBand (or in other host apps) on macOS Monterey running on an M1 Mac, there is a large number of warnings that read:
WARNING: SPI usage of '-[UINSWindow uiWindows]' is being shimmed. This will break in the future. Please file a radar requesting API for what you are trying to do.
I noticed that immediately after such a warning, the view's hitTest is called with a UIHoverEvent, and indeed moving the mouse results in more log spam. Although the very first occurrence of the warning may have a different root cause. Using a symbolic breakpoint wasn't helpful in revealing further information.
Note that I'm launching the AUv3 extension in the "Designed for iPad" mode. Project language is Objective-C. I was able to reproduce the issue with even an empty project, adding nothing beyond Xcode's built-in appex template for iOS Audio Units.
The issue doesn't seem to happen with the Apple sample code "CreatingCustomAudioEffects", which is a Swift project. I also compared and matched storyboard settings between that project and my own, but to no avail.
Any pointers would be highly appreciated.
Thanks.
Hello,
I'm working on a big app project with unit tests and UI tests running well.
Recently, one member of the team bought an M1 MacBook Pro with Big Sur 11.1 and tried to run the UI tests with the new Xcode 12.4 (available today, 27/01/21, from the Mac App Store).
When he selects any simulator with iOS 13.7 it's OK, but if he selects iOS 14.4, it fails. It doesn't matter whether it's an iPad or an iPhone.
The error talks about architectures, but our settings have no custom configuration that depends on the iOS version.
MYAPPUITests-Runner.app (72942) encountered an error (Failed to load the test bundle. If you believe this error represents a bug, please attach the result bundle at /Users/user/Library/Developer/Xcode/DerivedData/MYAPP-gefifpwdxlqqbidolncfjzimagfq/Logs/Test/Test-MYAPP-2021.01.27_23-13-22-+0100.xcresult. (Underlying Error: The bundle “MYAPPUITests” couldn’t be loaded because it doesn’t contain a version for the current architecture. The bundle doesn’t contain a version for the current architecture. Try installing a universal version of the bundle. dlopen_preflight(/Users/user/Library/Developer/Xcode/DerivedData/MYAPP-gefifpwdxlqqbidolncfjzimagfq/Build/Products/Debug-iphonesimulator/MYAPPUITests-Runner.app/PlugIns/MYAPPUITests.xctest/MYAPPUITests): no suitable image found.	Did find:
/Users/user/Library/Developer/Xcode/DerivedData/MYAPP-gefifpwdxlqqbidolncfjzimagfq/Build/Products/Debug-iphonesimulator/MYAPPUITests-Runner.app/PlugIns/MYAPPUITests.xctest/MYAPPUITests: mach-o, but wrong architecture))
Why is it OK with the iOS 13.7 simulator but not with iOS 14.4?
Any ideas?
Regards.