Post not yet marked as solved
We deliver rendering SDKs backed by OpenGL ES to our customers. Recently we have received many reports of occasional crashes on iOS devices, and the crash stacks are very similar:
0 libsystem_kernel.dylib 0x00000001bba0e9e8 0x1bba08000 + 27112 __pthread_kill (in libsystem_kernel.dylib) + 8
1 libsystem_c.dylib 0x000000018be5b0b4 0x18be3c000 + 127156 abort (in libsystem_c.dylib) + 120
2 libGFXShared.dylib 0x00000001c2030600 0x1c202d000 + 13824 gfxFreeTextureLevel (in libGFXShared.dylib) + 0
3 GLEngine 0x00000001c1f5e390 0x1c1f4c000 + 74640 glTexImage2D_Exec (in GLEngine) + 1064
4 OpenGLES 0x00000001c2017f00 0x1c2015000 + 12032 glTexImage2D (in OpenGLES) + 80
The crash above does not look like the usual OpenGL ES background-execution failure. Could anyone share some suggestions about this? Thanks a lot!
Post not yet marked as solved
OpenGL crashes when I call 'presentRenderbuffer' on the iOS 16 beta. It works fine on iOS 15 and below.
I am updating my old pre-notch OpenGL ES app, and when it launches on my iPhone 13, it has black bars at the top and bottom. I was expecting it to fill the screen and go under the notch. Why is it not doing that?
When my app starts, I set my view bounds using [[UIScreen mainScreen] bounds], which is giving me dimensions of 375 x 667. The nativeBounds are supposedly 1125 x 2001.
But the iPhone 13 is 1170 x 2532 pixels.
Why do I get these numbers, and where in the app settings is this determined? I can't find anything in the .plist or anywhere else...
I've read about safe areas, but my app is an old-school CAEAGLLayer app with no storyboard. I do have a navigationController, but the smaller display area is reported before that is even created... I can't find any documentation online about this either. Does anyone know?
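In my understanding, 375 x 667 is the iPhone 8 logical size, which is what UIKit reports when an app ships without a launch storyboard: the system runs it letterboxed in a compatibility mode, which would also explain the black bars. A likely fix is to add a (possibly empty) launch storyboard to the project and reference it from Info.plist, roughly like this (the storyboard name here is an example, not something your project already contains):

```xml
<!-- Info.plist fragment: opting in to the full native screen size.
     "LaunchScreen" is an example name for an empty launch storyboard
     added to the project. -->
<key>UILaunchStoryboardName</key>
<string>LaunchScreen</string>
```

After adding this, [[UIScreen mainScreen] bounds] should report the device's real logical size, and the app should extend under the notch (at which point the safe-area insets become relevant).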
Post not yet marked as solved
My application freezes on a call to glTexImage2D. This is the stack trace I get from lldb:
thread #3
frame #0: 0x00000001bb04d9b4 libsystem_kernel.dylib`mach_msg_trap + 8
frame #1: 0x00000001bb04dd60 libsystem_kernel.dylib`mach_msg + 76
frame #2: 0x00000001bdb28dd0 IOKit`io_connect_method + 440
frame #3: 0x00000001bdb28bec IOKit`IOConnectCallMethod + 236
frame #4: 0x00000001d5b1b970 IOGPU`IOGPUResourceCreate + 224
frame #5: 0x00000001d5b159f8 IOGPU`-[IOGPUMetalResource initWithDevice:remoteStorageResource:options:args:argsSize:] + 476
frame #6: 0x00000001d5b18ab8 IOGPU`-[IOGPUMetalTexture initWithBuffer:descriptor:sysMemOffset:sysMemRowBytes:vidMemSize:vidMemRowBytes:args:argsSize:isStrideTexture:] + 616
frame #7: 0x00000001d5b1925c IOGPU`-[IOGPUMetalTexture initWithPrimaryBuffer:heapIndex:bufferIndex:bufferOffset:length:descriptor:sysMemRowBytes:vidMemSize:vidMemRowBytes:args:argsSize:] + 96
frame #8: 0x000000021c45bc10 AGXMetal13_3`___lldb_unnamed_symbol4663$$AGXMetal13_3 + 468
frame #9: 0x0000000133953aa8 AppleMetalOpenGLRenderer`GLDTextureRec::loadObj() + 3156
frame #10: 0x000000013398a960 AppleMetalOpenGLRenderer`gldModifyTexSubImage + 160
frame #11: 0x000000021bd9a7e8 GLEngine`glTexImage2D_Exec + 1944
frame #12: 0x000000021bd7b8d0 libGL.dylib`glTexImage2D + 80
My questions are: how can a call to glTexImage2D freeze and leave the thread stuck forever? Why does this happen, and how can I solve it?
Thank you in advance; I'm looking forward to any suggestions.
Post not yet marked as solved
This project targets both Android and iOS, so I use a CAEAGLLayer to present live video at 60 fps. All the code works well on iPhone 11 and older devices, but on iPhone 12 and iPhone 13 it behaves strangely.
The layer drops some frames. I profiled with Instruments and found that some drawables are waited on for more than 1/60 of a second. After I turned on the screen recorder, it worked well: all drawables were waited on for less than 1/60 of a second, and the layer presented video at 60 fps. After I turned the screen recorder off, it stopped working again.
Can anyone tell me what is happening and how to work around it?
Post not yet marked as solved
iOS 14.2, iPhone XR
I downloaded the demo from https://developer.apple.com/documentation/metal/mixing_metal_and_opengl_rendering_in_a_view?language=objc
It plays normally.
Then I changed the code as follows.
The purpose is to switch between "mixed rendering" and "OpenGL-only rendering", and to re-create the AAPLMetalRenderer each time "mixed rendering" is re-entered.
A 'bug' appears: each time the app switches from "OpenGL only" back to "mixed rendering", the first frame displays an old picture (the last picture shown before the previous switch from "mixed rendering" to "OpenGL only").
On that first frame the code re-creates the AAPLMetalRenderer and calls drawToInteropTexture, but it seems the interop texture has not been updated yet (or OpenGL's draw does not wait for Metal to finish rendering to the interop texture?).
So my question is: how do Metal and OpenGL synchronize?
int counter = 0;
bool currentMetal = false;

- (void)draw:(id)sender
{
    [EAGLContext setCurrentContext:_context];
    counter++;
    counter = counter % 180;
    if (counter < 90)
    {
        bool waitForFinish = false;
        if (!currentMetal) // re-entering "mixed rendering"
        {
            // Re-create the Metal renderer
            _metalRenderer = nil;
            _metalRenderer = [[AAPLMetalRenderer alloc] initWithDevice:_metalDevice
                                                      colorPixelFormat:AAPLOpenGLViewInteropPixelFormat];
            [_metalRenderer useTextureFromFileAsBaseMap];
            [_metalRenderer resize:AAPLInteropTextureSize];
        }
        currentMetal = true;
        [_metalRenderer drawToInteropTexture:_interopTexture.metalTexture waitForFinish:waitForFinish];
        [_openGLRenderer draw];
    }
    else
    {
        [_metalRenderer justUpdate]; // no Metal rendering
        [_openGLRenderer justClear]; // just clear OpenGL's FBO
        currentMetal = false;
    }
    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);
    [_context presentRenderbuffer:GL_RENDERBUFFER];
}
_openGLRenderer's justClear is:
- (void)justClear
{
    glBindFramebuffer(GL_FRAMEBUFFER, _defaultFBOName);
    glClearColor(0.5, 0.5, 0.5, 1);
    glClear(GL_COLOR_BUFFER_BIT);
}
_metalRenderer's justUpdate is:
- (void)justUpdate
{
    [self updateState];
}
Hi everyone,
Running Xcode 12, I'm developing a standalone watchOS app using SpriteKit.
I'm using a simple Palette-swap shader using a SKShader with several SKAttributes.
The shader seems to work fine, however I get the following message in the console: [Metal Compiler Warning] Warning: Compilation succeeded with:
program_source:3:19: warning: unused variable 's'
constexpr sampler s(coord::normalized,
I have no idea what this means; there is no variable 's' in my code, so I guess the GLSL shader code gets compiled to Metal in the background and something goes wrong there? How do I verify this? There also seems to be a memory leak associated with this warning, so I'm keen to solve it. Any help would be appreciated, even if it's just pointing me in a direction (like: should I write a new palette shader in Metal?). Thanks in advance!
The function that adds the SKShader (an extension to SKSpriteNode):
func addMultiColorShader(strokeColor: UIColor, colors: [UIColor]) {
    let useColors: [UIColor] = shaderColors(colors)
    var attributes: [SKAttribute] = [SKAttribute(name: "a_size", type: .vectorFloat2),
                                     SKAttribute(name: "a_scale", type: .vectorFloat2),
                                     SKAttribute(name: "a_alpha", type: .float)]
    let colorNames: [String] = ["a_color0", "a_color1", "a_color2", "a_color3"]
    for i in 0..<colorNames.count {
        attributes += [SKAttribute(name: colorNames[i], type: .vectorFloat4)]
    }
    let shader = SKShader(fromFile: "SKHMultiColorize", attributes: attributes)
    setShaderAttributes(size: self.size, scale: CGSize(width: self.xScale, height: self.yScale), alpha: self.alpha)
    setShaderColors(strokeColor: strokeColor, color1: useColors[0], color2: useColors[1], color3: useColors[2])
    self.shader = shader
}
The actual shader code (GLSL):
vec2 nearestNeighbor(vec2 loc, vec2 size) {
    vec2 onePixel = vec2(1.0, 1.0) / size;
    vec2 coordinate = floor(loc * size) / size; // snap to the pixel grid
    return coordinate + (onePixel * 0.5);
}

void main()
{
    vec4 colors[4];
    colors[0] = a_color0;
    colors[1] = a_color1;
    colors[2] = a_color2;
    colors[3] = a_color3;
    vec4 currentColor = texture2D(u_texture, nearestNeighbor(v_tex_coord, a_size));
    int colorIndex = int(currentColor.r * 5.1);
    vec4 refColor = colors[colorIndex];
    gl_FragColor = refColor * currentColor.a;
}
Post not yet marked as solved
Hi, I would like to know whether OpenGL will still be supported in iOS 14. This info is needed to plan the migration activities accordingly. Is there a link that provides a concrete end-of-life date?
Post not yet marked as solved
Hello,
I have a problem with my iOS application, which displays video streaming via MobileVlcKit. This feature has been available in my application for many months, but for a few days now I haven't been able to see the video stream in my application!
When I use the Xcode simulator, the video stream is displayed correctly.
But when I launch a local build or a TestFlight build, I get a black screen without my video stream!
When I run a local build over the USB cord, I see this message in the debugging console:
"Unable to determine our source address: This computer has an invalid IP address: 0.0.0.0"
can someone please help me?
Post not yet marked as solved
How can I capture an OpenGL ES GPU frame with Xcode, or are there other tools that can do this job?
Post not yet marked as solved
I created two CAEAGLLayers to display two independent OpenGL ES views, mainview and childview, which inherit from UIView and override layerClass.
First, mainview runs its CAEAGLLayer on the main thread, creates an EAGLContext, and uses a CADisplayLink to refresh the view. Mainview's CAEAGLLayer has opaque = true.
The second view, childview, is overlaid on the mainview. On a custom background thread, I create its CAEAGLLayer and EAGLContext, perform the initialization, and use a CADisplayLink to refresh the view.
The whole process runs well on the iOS Simulator, but some anomalies appear on a real iPhone 11:
If childview's CAEAGLLayer has opaque = true, then after childview is created I must touch the iPhone 11's screen before I can see the OpenGL ES content; after that, the interface displays normally.
If childview's CAEAGLLayer has opaque = false, I can see the OpenGL view without touching the screen, but there are transparency problems in the display: when a translucent texture is displayed in childview, I can see through to the view displayed by mainview.
I hope I have described the problem clearly. How can I fix it?
Thanks for any insights, examples, or suggestions.
Post not yet marked as solved
Hey guys,
Recently I've been trying to use GPU frame capture to debug an OpenGL ES 3.0 app, but when I choose a pixel and try to debug the shader, the Debug button cannot be clicked, and an error message is displayed: "Shader Debugger is not supported. SDK and Xcode not aligned. The installed platform SDK is too old for the Xcode. Update your platform SDK to support the shader debugger."
Hot-reloading shaders doesn't work either; Xcode says "Reloading shaders" and gets stuck.
Here is my dev environment: Xcode 12.5.1, macOS Big Sur 11.4, iPhone 8 Plus with iOS 14.6, and the iOS 14.5 SDK installed.