There is the Android Dynamic Performance Framework (https://developer.android.com/games/optimize/adpf), which lets you monitor the device's thermal state and send performance hints to the OS describing the current workload.
This helps consume resources effectively while hitting a target performance level. As far as I can see from tracing and profiling, the hints help the OS scheduler move tasks between cores more effectively, which improves performance stability across multiple runs.
I wonder, is there anything similar for the iOS platform?
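For context, here is a minimal Swift sketch of the closest thing I'm aware of on iOS, ProcessInfo's thermal-state API; this only covers the thermal-monitoring half of ADPF, not performance hints, and treating it as a counterpart is my own assumption:

```swift
import Foundation

// Sketch: observe the device's thermal state on iOS (rough analogue of the
// ADPF thermal monitoring described above; there is no direct equivalent of
// ADPF performance hints shown here).
final class ThermalMonitor {
    private var observer: NSObjectProtocol?

    func start() {
        // Read the current state once.
        logState(ProcessInfo.processInfo.thermalState)

        // Get notified whenever the thermal state changes.
        observer = NotificationCenter.default.addObserver(
            forName: ProcessInfo.thermalStateDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.logState(ProcessInfo.processInfo.thermalState)
        }
    }

    private func logState(_ state: ProcessInfo.ThermalState) {
        switch state {
        case .nominal:  print("Thermal state: nominal")
        case .fair:     print("Thermal state: fair, consider reducing workload")
        case .serious:  print("Thermal state: serious, reduce frame rate or quality")
        case .critical: print("Thermal state: critical, minimize work")
        @unknown default: print("Thermal state: unknown")
        }
    }
}
```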
1. Open the DingTalk macOS application, start a meeting, and initiate screen sharing.
2. The DingTalk app calls the [SCShareableContent getShareableContentWithCompletionHandler:] API (a timing sketch is shown below).
3. The API takes over 40 seconds to return a response.
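For reference, a minimal Swift sketch of the kind of enumeration being timed; it uses the async Swift accessor, which I assume is an equivalent path to the Objective-C completion-handler call named above:

```swift
import Foundation
import ScreenCaptureKit

// Sketch: time how long shareable-content enumeration takes.
func measureShareableContentFetch() async {
    let start = Date()
    do {
        let content = try await SCShareableContent.excludingDesktopWindows(false,
                                                                           onScreenWindowsOnly: true)
        let elapsed = Date().timeIntervalSince(start)
        print("Got \(content.displays.count) displays and \(content.windows.count) windows in \(elapsed) s")
    } catch {
        print("Shareable content enumeration failed: \(error)")
    }
}
```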
I'm facing a lot of lag in PUBG Mobile when I'm playing; it's not running properly.
I have used Macs with M1 and M4 chips, developing OpenGL projects on machines running macOS 15.2 and 13.6.
When I call the macOS OpenGL function glTexImage2D with any of the three formats GL_LUMINANCE, GL_LUMINANCE_ALPHA, or GL_ALPHA, I get GL error 500.
This makes me unable to draw normally on the Mac. What is the reason for this? Are these formats not supported?
Hey there,
I tried to install GPTK again, since I had to reinstall the OS for unrelated reasons. But every time I try to install the toolkit, it gives me the `Error: apple/apple/game-porting-toolkit 1.1 did not build` error. Before that error occurred, I had the OpenSSL error, which I fixed with the rbenv version of OpenSSL. Is there any way to fix this error? Below you'll find the full error message it gave me. The specs for my Mac (if they're helpful in any way): M1 Pro MBP 14" with 16GB RAM and 512GB SSD.
Thanks!
```
Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/myuser/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so at (not Homebrew/brew or Homebrew/homebrew-core):
https://github.com/apple/homebrew-apple/issues
```
How can I create a beautiful fire animation using Swift?
Which API is best to use?
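One option, shown here purely as an illustrative sketch rather than the one right answer, is Core Animation's particle system, CAEmitterLayer (SpriteKit's SKEmitterNode is another common choice). This minimal UIKit sketch assumes a small particle image exists in the asset catalog under the hypothetical name "spark":

```swift
import UIKit

// Sketch: a simple fire effect with CAEmitterLayer.
func makeFireLayer(in view: UIView) -> CAEmitterLayer {
    let emitter = CAEmitterLayer()
    emitter.emitterPosition = CGPoint(x: view.bounds.midX, y: view.bounds.maxY)
    emitter.emitterShape = .line
    emitter.emitterSize = CGSize(width: view.bounds.width * 0.5, height: 1)

    let cell = CAEmitterCell()
    cell.contents = UIImage(named: "spark")?.cgImage   // hypothetical asset name
    cell.birthRate = 300
    cell.lifetime = 1.5
    cell.velocity = 120
    cell.velocityRange = 40
    cell.emissionLongitude = -CGFloat.pi / 2           // aim initial velocity upward
    cell.emissionRange = CGFloat.pi / 8
    cell.scale = 0.08
    cell.scaleSpeed = -0.04
    cell.yAcceleration = -60                           // particles keep rising, like heat
    cell.alphaSpeed = -0.6
    cell.color = UIColor(red: 1.0, green: 0.45, blue: 0.1, alpha: 0.8).cgColor

    emitter.emitterCells = [cell]
    view.layer.addSublayer(emitter)
    return emitter
}
```

Tuning birthRate, lifetime, and the color/alpha fade is what makes it read as fire rather than sparks.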
Hi, I am trying to detect whether all screens are in fullscreen mode.
The current approach is to get all windows' information from CGWindowListCopyWindowInfo and then compare each window's frame and coordinates with the frames of NSScreen.screens.
However, there is a problem: the y position of the window appears to be relative to the screen. Since it is not an absolute position, I cannot compare it with the screen's coordinates.
Does anyone know if there is other information I can use? Or is there a way to retrieve the absolute position or screen ID from the CGWindow dictionary?
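For what it's worth, the bounds reported by CGWindowListCopyWindowInfo are in the global CoreGraphics space (origin at the top-left of the primary display, y increasing downward), while NSScreen frames use the flipped Cocoa space (origin at the bottom-left of the primary display). Below is a minimal sketch of converting one to the other so the two can be compared; it is an assumption about your setup, not tested against it:

```swift
import AppKit
import CoreGraphics

// Sketch: convert a window rect from CG global coordinates (top-left origin,
// y down) to Cocoa global coordinates (bottom-left origin, y up) so it can be
// compared against NSScreen.screens[n].frame.
func cocoaRect(fromCGWindowBounds cgRect: CGRect) -> CGRect {
    // The primary display's height defines the flip; in Cocoa space the
    // primary screen's frame origin is (0, 0).
    let primaryHeight = CGDisplayBounds(CGMainDisplayID()).height
    return CGRect(x: cgRect.origin.x,
                  y: primaryHeight - cgRect.origin.y - cgRect.height,
                  width: cgRect.width,
                  height: cgRect.height)
}

func fullscreenWindowRects() -> [CGRect] {
    guard let info = CGWindowListCopyWindowInfo([.optionOnScreenOnly, .excludeDesktopElements],
                                                kCGNullWindowID) as? [[String: Any]] else { return [] }
    return info.compactMap { entry in
        guard let boundsDict = entry[kCGWindowBounds as String] as? [String: Any],
              let bounds = CGRect(dictionaryRepresentation: boundsDict as CFDictionary) else { return nil }
        let rect = cocoaRect(fromCGWindowBounds: bounds)
        // Treat a window as fullscreen if its converted rect equals some screen's frame.
        return NSScreen.screens.contains(where: { $0.frame == rect }) ? rect : nil
    }
}
```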
Dear Apple, there is still no support for WebGPU via WebView. Do you have a roadmap for when you plan to support it? That would be very helpful for releasing our new game.
I have an SPFx React application where I print the HTML page content using the default JavaScript window.print() functionality. Once I save the page as a PDF from the print preview window and open it using Adobe Acrobat, the links (e.g., Google) within the content are not clickable and appear as plain text.
I have tried printing random pages after searching for various keywords in Google and saving the files as PDFs, but unfortunately the links are not clickable there either.
To check whether it is an Adobe Acrobat issue, I performed the same print functionality on Android devices and shared the PDF file to iOS devices; in that case, when opened using Adobe Acrobat, the links are clickable.
I am wondering whether this is related to how the default print functionality works on iPadOS and iOS devices. Any insights on this would be really helpful. Thanks!!!
Note: The links are clickable on macOS as well as on Windows.
#ios #ipados #javascript #spfx #react
Hello,
I'm trying to install it following these steps: https://www.applegamingwiki.com/wiki/Game_Porting_Toolkit but I get an error with 'brew install apple/apple/game-porting-toolkit':
==> tar -xf crossover-sources-22.1.1.tar.gz --include=sources/clang/* --strip-components=2
==> cmake -G Ninja -DCMAKE_VERBOSE_MAKEFILE=Off -DCMAKE_MAKE_PROGRAM=ninja -DCMAKE_VERBOSE_MAKEFILE=On -DCMAKE_OSX_ARCHITECTUR
Last 15 lines from /Users/rafael/Library/Logs/Homebrew/game-porting-toolkit-compiler/02.cmake:
-DLLVM_INSTALL_TOOLCHAIN_ONLY=On
-DLLVM_ENABLE_PROJECTS=clang
/private/tmp/game-porting-toolkit-compiler-20250519-44600-qwrjgl/llvm
CMake Error at CMakeLists.txt:3 (cmake_minimum_required):
Compatibility with CMake < 3.5 has been removed from CMake.
Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
to tell CMake that the project requires at least <min> but has been updated
to work with policies introduced by <max> or earlier.
Or, add -DCMAKE_POLICY_VERSION_MINIMUM=3.5 to try configuring anyway.
-- Configuring incomplete, errors occurred!
If reporting this issue please do so to (not Homebrew/* repositories):
apple/apple
macOS 15.3.1
Thank you in advance
Regards
Imagine a native macOS app that acts as a "launcher" for a Java game.** For example, the "launcher" app might use the Swift Process API or a similar method to run the java command line tool (let's assume the user has installed Java themselves) to run the game.
I have seen How to Enable Game Mode. If the native launcher app's Info.plist has the following keys set:
LSApplicationCategoryType set to public.app-category.games
LSSupportsGameMode set to true (for macOS 26+)
GCSupportsGameMode set to true
The launcher itself can cause Game Mode to activate if the launcher is fullscreened. However, if the launcher opens a Java process that opens a window, and that Java window is then fullscreened, Game Mode doesn't seem to activate. In this case activating Game Mode for the launcher itself is unnecessary, but you'd expect Game Mode to activate when the actual game in the Java window is fullscreened.
Is there a way to get Game Mode to activate in the latter case?
** The concrete case I'm thinking of is a third-party Minecraft Java Edition launcher, but the issue can also be demonstrated in a sample project (FB13786152). It seems like the official Minecraft launcher is able to do this, though it's not clear how. (Is its bundle identifier hardcoded in the OS to allow for this? Changing a sample app's bundle identifier to be the same as the official Minecraft launcher gets the behavior I want, but obviously this is not a practical solution.)
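For concreteness, a minimal sketch of the "launcher runs java" pattern described above; the java path and jar path are hypothetical and assume a user-installed Java:

```swift
import Foundation

// Sketch: launch the Java game from the native launcher via Process.
func launchJavaGame() throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/java") // hypothetical location
    process.arguments = ["-jar", "/path/to/Game.jar"]             // hypothetical game jar
    try process.run()
    // The game now runs in a separate Java process; its windows are not owned
    // by the launcher bundle, which seems to be why Game Mode does not engage.
    process.waitUntilExit()
}
```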
In swipe-driven games, a first downward swipe starting near the home indicator can trigger Reachability, even when using preferredScreenEdgesDeferringSystemGestures = .bottom and prefersHomeIndicatorAutoHidden = true. This causes the app to slide down to half screen and breaks gameplay. Please consider an API to temporarily defer Reachability while a custom gesture is active (similar to existing system gesture deferral), without disabling accessibility globally.
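For context, a minimal sketch of the deferral overrides mentioned above, assuming a UIKit game view controller; as reported, these defer bottom-edge system gestures but do not prevent Reachability:

```swift
import UIKit

// Sketch: the existing workarounds referenced above, in a game's view controller.
final class GameViewController: UIViewController {

    // Ask the system to give the app first crack at bottom-edge swipes.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return .bottom
    }

    // Hide the home indicator while playing.
    override var prefersHomeIndicatorAutoHidden: Bool {
        return true
    }

    func gameplayStateChanged() {
        // Re-evaluate the overrides when gameplay state changes.
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
        setNeedsUpdateOfHomeIndicatorAutoHidden()
    }
}
```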
Environment:
Devices: iPhones with Home Indicator (Face ID)
Why this matters:
Bottom-origin swipes are core in many games (flick shots, slingshot, physics toss, bottom sheets). Current workarounds degrade UX and discoverability, and players still accidentally trigger Reachability.
Feedback Assistant Post
Hello,
We are experiencing an issue with apparent lag or latency when interacting with objects in Unity using the XR Interaction Toolkit. Even when interactables are configured with the Movement Type set to Instantaneous, objects exhibit a noticeable delay when following hand movement. This issue is particularly apparent when the hand moves at higher speeds.
We would like to know: is any additional configuration required, or is there something else we need to do to resolve this?
Thank you very much for your assistance.
When using the simd_inverse functions, the same input yields different results on different devices.
On a Mac mini with M4 Pro and on an M3 Ultra, the result is:
But on a Mac mini with M2 Ultra, the result is:
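For reference, a minimal sketch of the kind of call in question; the input values here are made up for illustration and are not the original matrix (which was attached as an image):

```swift
import simd

// Sketch: invert a 4x4 matrix and check the round trip.
let m = simd_float4x4(rows: [
    SIMD4<Float>(2, 0, 0, 1),
    SIMD4<Float>(0, 3, 0, 2),
    SIMD4<Float>(0, 0, 4, 3),
    SIMD4<Float>(0, 0, 0, 1)
])

let inv = simd_inverse(m)          // same as m.inverse
let roundTrip = simd_mul(m, inv)   // should be close to identity

print(inv)
print(roundTrip)
```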
Hi, I'm trying to use metal-cpp, but I have compile errors:
ISO C++ requires the name after '::' to be found in the same scope as the name before '::'
metal-cpp/Foundation/NSSharedPtr.hpp(162):
template <class _Class>
_NS_INLINE NS::SharedPtr<_Class>::~SharedPtr()
{
if (m_pObject)
{
m_pObject->release();
}
}
Use of old-style cast
metal-cpp/Foundation/NSObject.hpp(149):
template <class _Dst>
_NS_INLINE _Dst NS::Object::bridgingCast(const void* pObj)
{
#ifdef __OBJC__
return (__bridge _Dst)pObj;
#else
return (_Dst)pObj;
#endif // __OBJC__
}
The Xcode project was generated using CMake:
target_compile_features(${MODULE_NAME} PRIVATE cxx_std_20)
target_compile_options(${MODULE_NAME}
PRIVATE
"-Wgnu-anonymous-struct"
"-Wold-style-cast"
"-Wdtor-name"
"-Wpedantic"
"-Wno-gnu"
)
Maybe I need to set some CMake flags for the C++ compiler?
Our application uses Core Image to apply custom CIFilters to still images and video. I'm running into issues when the supplied image is large enough (>4096) that the image is automatically tiled. The simplest of these to describe is a filter that performs various mirroring effects - backwards, upside-down etc.
The implementation portion of the filter provides a sampler (src) and passes this into the kernel with an roiCallback that uses the destRect, inset by -1 in both dimensions:
return [mirrorsKernel applyWithExtent:[src extent] roiCallback:^CGRect(int index, CGRect destRect) { return CGRectInset(destRect, -1, -1); }
arguments:@[src]
];
The kernel is very simple, sampling from the X coordinate equal to the src width - current coordinate:
float4 backwards(sampler image, destination dest)
{
float2 dc = dest.coord();
dc.x = image.size().x - dc.x;
return image.sample(image.transform(dc));
}
When this runs on an image that is wider than 4096, tiling happens, with the result being that destRect is not the entire image and therefore the resulting output image is incorrect. If the ROI uses [src extent] instead of destRect, the result is correct, but this will lead to serious performance issues when src gets too large.
All of this makes sense to me. What I'd like to know is if there is a way to handle this filter's requirements for sampling from the entire source while still limiting the ROI to maintain performance? I think the answer is probably no within our current structure and performance limits. But I wanted to see if there's anything we're missing.
I am aware that the simple kernel above can be replaced with an affine transform, which is an option for backwards and upside-down mirroring. We have other kernels in this filter that perform mirroring of either half of the source image or one quadrant of the source image. In these cases, I suppose it might be possible (up to a point) to create a custom ROI that is only the portion of the source that is being mirrored. We have not attempted that yet.
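To illustrate that custom-ROI idea on the simple "backwards" kernel above: for a requested destRect, the kernel only reads the horizontally mirrored strip of the source, so the ROI can be limited to that strip instead of the whole extent. The Swift sketch below assumes the source extent has its origin at (0, 0) and is only an outline of the approach, not tested code:

```swift
import CoreImage

// Sketch: ROI limited to the mirrored region instead of [src extent].
// For dest x in [minX, maxX], the "backwards" kernel reads source x in
// [W - maxX, W - minX], where W is the source width (extent origin assumed at 0).
func applyBackwards(kernel mirrorsKernel: CIKernel, to src: CIImage) -> CIImage? {
    let extent = src.extent
    return mirrorsKernel.apply(
        extent: extent,
        roiCallback: { _, destRect in
            let mirrored = CGRect(x: extent.width - destRect.maxX,
                                  y: destRect.minY,
                                  width: destRect.width,
                                  height: destRect.height)
            return mirrored.insetBy(dx: -1, dy: -1)
        },
        arguments: [src]
    )
}
```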
Any thoughts/input appreciated, thanks!
I have this game controller connected to my M1, and the Simulator won't announce it via .GCControllerDidConnect. This works fine on iOS and macOS.
I have the Simulator set to "Send Game Controller to Device", which the Simulator does. If I disable that, then I can control the Simulator view. But once it's enabled, the Simulator doesn't tell the app about the controller.
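For reference, a minimal sketch of the observation being relied on; this is my reconstruction of the setup, not the exact code in the app:

```swift
import GameController

// Sketch: listen for controller connections and also check what's already attached.
func startWatchingControllers() {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect,
        object: nil,
        queue: .main
    ) { note in
        if let controller = note.object as? GCController {
            print("Controller connected: \(controller.vendorName ?? "unknown")")
        }
    }

    // Controllers that were already connected before the observer was added.
    for controller in GCController.controllers() {
        print("Already connected: \(controller.vendorName ?? "unknown")")
    }
}
```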
Hello!
Brand new to the Apple developer community, so, hello everyone! I'm a game developer; we just launched our first game on PC and I'm looking to port it to iOS. Time is something I'm kind of short on, and I hear it takes some jumping through hoops to get the know-how to port something to mobile. Any good sites you'd recommend for finding programmers to port your game? It's fairly simple - just a visual novel. Any and all suggestions welcome!
All the best!
Elijah
I’m trying to create a CGContext that matches my screen using the following code
let context = CGContext(
data: pixelBuffer,
width: pixelWidth, // 3456
height: pixelHeight, // 2234
bitsPerComponent: 10, // PixelEncoding = "--RRRRRRRRRRGGGGGGGGGGBBBBBBBBBB"
bytesPerRow: bytesPerRow, // 13824
space: CGColorSpace(name: CGColorSpace.extendedSRGB)!,
bitmapInfo: CGImageAlphaInfo.none.rawValue
| CGImagePixelFormatInfo.RGBCIF10.rawValue
| CGImageByteOrderInfo.order16Little.rawValue
)
But it fails with an error
CGBitmapContextCreate: unsupported parameter combination:
RGB
10 bits/component, integer
13824 bytes/row
kCGImageAlphaNone
kCGImageByteOrder16Little
kCGImagePixelFormatRGBCIF10
Valid parameters for RGB color space model are:
16 bits per pixel, 5 bits per component, kCGImageAlphaNoneSkipFirst
32 bits per pixel, 8 bits per component, kCGImageAlphaNoneSkipFirst
32 bits per pixel, 8 bits per component, kCGImageAlphaNoneSkipLast
32 bits per pixel, 8 bits per component, kCGImageAlphaPremultipliedFirst
32 bits per pixel, 8 bits per component, kCGImageAlphaPremultipliedLast
32 bits per pixel, 10 bits per component, kCGImageAlphaNone|kCGImagePixelFormatRGBCIF10|kCGImageByteOrder16Little
64 bits per pixel, 16 bits per component, kCGImageAlphaPremultipliedLast
64 bits per pixel, 16 bits per component, kCGImageAlphaNoneSkipLast
64 bits per pixel, 16 bits per component, kCGImageAlphaPremultipliedLast|kCGBitmapFloatComponents|kCGImageByteOrder16Little
64 bits per pixel, 16 bits per component, kCGImageAlphaNoneSkipLast|kCGBitmapFloatComponents|kCGImageByteOrder16Little
128 bits per pixel, 32 bits per component, kCGImageAlphaPremultipliedLast|kCGBitmapFloatComponents
128 bits per pixel, 32 bits per component, kCGImageAlphaNoneSkipLast|kCGBitmapFloatComponents
See Quartz 2D Programming Guide (available online) for more information.
Why is it unsupported if it matches the 6th option? (32 bits per pixel, 10 bits per component, kCGImageAlphaNone|kCGImagePixelFormatRGBCIF10|kCGImageByteOrder16Little)
End goal: to detect 3 lines and 2 corners accurately. I'm trying contours, but they are a bit off. Is there a way, or settings in contours, to detect corners and lines more accurately, maybe fewer but sharper edge/corner contours? Or some other API or method, please?
I would love an email please ;) thank you. 2. There is also an overlay/scale issue.
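In case it helps frame the question, here is a minimal sketch of Vision's contour detection with the tunable settings I'm aware of (contrast adjustment and maximum image dimension); whether those are enough to get sharper corners is exactly what I'm asking:

```swift
import Vision
import CoreGraphics

// Sketch: run contour detection with a couple of adjustable settings.
func detectContours(in image: CGImage) throws -> VNContoursObservation? {
    let request = VNDetectContoursRequest()
    request.contrastAdjustment = 2.0        // raise to exaggerate edges before detection
    request.maximumImageDimension = 1024    // larger values keep more detail, cost more time

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results?.first as? VNContoursObservation
}

// Usage: inspect the detected top-level contours and their point counts.
// let observation = try detectContours(in: someCGImage)
// observation?.topLevelContours.forEach { print($0.pointCount) }
```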