Turns out Homebrew seems to have spammed hundreds of copies of the line
`eval "$(/opt/homebrew/bin/brew shellenv)"`
into my ~/.zprofile, which I didn't even know existed.
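If anyone else hits this, the duplicates are easy to collapse with a small script. A rough sketch in Python (the temp-file demo stands in for ~/.zprofile, and the exact `eval` line is taken from above):

```python
# Sketch: collapse repeated "brew shellenv" lines down to a single occurrence.
# Demonstrated on a temp file; point `profile` at os.path.expanduser("~/.zprofile")
# to run it for real. A .bak copy is kept in case anything goes wrong.
import os
import shutil
import tempfile

BREW_LINE = 'eval "$(/opt/homebrew/bin/brew shellenv)"'

def dedupe_line(profile, line):
    backup = profile + ".bak"
    shutil.copy(profile, backup)              # always back up first
    seen = False
    kept = []
    with open(backup) as f:
        for raw in f:
            if raw.rstrip("\n") == line:
                if seen:
                    continue                  # drop every repeat
                seen = True
            kept.append(raw)
    with open(profile, "w") as f:
        f.writelines(kept)

# Demo on a temp file simulating the spammed ~/.zprofile:
demo = os.path.join(tempfile.mkdtemp(), "zprofile")
with open(demo, "w") as f:
    f.write("\n".join([BREW_LINE] * 5 + ["export FOO=bar", BREW_LINE]) + "\n")
dedupe_line(demo, BREW_LINE)
print(open(demo).read())
```

Other lines in the file are left untouched; only the exact `brew shellenv` line is deduplicated.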
We're seeing the same issue in all our WebGL applications.
It definitely worked in April 2021; since then, without changing any code, Safari performance went from 60 fps to 2 fps.
Example scene built in Unity (with a camera texture, since camera input is important for our AR applications):
https://test.looc.io/forest/index.html
Seeing the same issue here. Our test scene used to render at 60 fps back in April 2021; now, without changing any code, it runs at 2 fps and my whole MacBook Air freezes.
Test scene: https://test.looc.io/forest/index.html
So it seems just putting the line `token preliminary:anchoring:type = "face"` in was not enough.
What worked was writing a Python script, importing `from pxr import Usd, Sdf, UsdGeom, Kind`, and then creating a scene hierarchy `/Root/Scenes/Scene/Children/MyModel` where the Scene prim gets the anchor token.
The USD class reference (https://graphics.pixar.com/usd/docs/api/class_usd_stage.html) helped a little, but it was way more complicated than I expected it to be.
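For anyone trying to reproduce this, the resulting file had roughly this shape (a sketch from memory, not a guaranteed-complete schema; the prim name MyModel is a placeholder, and the anchor token sits on the Scene prim rather than the model):

```
#usda 1.0
(
    defaultPrim = "Root"
)

def Xform "Root"
{
    def Xform "Scenes"
    {
        def Xform "Scene"
        {
            token preliminary:anchoring:type = "face"

            def Xform "Children"
            {
                def Xform "MyModel"
                {
                }
            }
        }
    }
}
```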
Mhm, so I thought, according to the Swift Package Manager team, an Xcode project is not necessary to build an XCFramework based on a Swift package - https://github.com/apple/swift-package-manager/pull/2981#issuecomment-710282803.
Boris is also referencing the same demo example he did in the 2020 WWDC video Distribute binary frameworks as Swift packages - https://developer.apple.com/videos/play/wwdc2020/10147/ - so I took this to be the latest status, superseding the 2019 video Binary Frameworks in Swift.
As a Swift package, my project automatically gets the generated Bundle.module accessor, e.g.
```swift
let string = NSLocalizedString(self, tableName: nil, bundle: .module, value: "", comment: "")
```
which is no longer available if I wrap the code into an .xcodeproj.
Based on the GitHub comment mentioned above, I thought the xcodeproj route is no longer necessary.
If I comment out the `.binaryTarget` and only use `.target(name:)`, the build output is `ARCHIVE SUCCEEDED [58.086 sec]`, but if I then look into the archived file, it has no *.framework inside it.
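For context, the two target declarations in question look like this in the manifest (a trimmed sketch; the package name, zip URL, and checksum are placeholders, not my real package):

```swift
// Package.swift (sketch): swapping between a source target and a binary target.
import PackageDescription

let package = Package(
    name: "MyLibrary",
    products: [
        .library(name: "MyLibrary", targets: ["MyLibrary"]),
    ],
    targets: [
        // Plain source target: archives fine, but no .framework ends up in the archive.
        .target(name: "MyLibrary"),
        // Binary target: points at a prebuilt XCFramework instead.
        // .binaryTarget(
        //     name: "MyLibrary",
        //     url: "https://example.com/MyLibrary.xcframework.zip",
        //     checksum: "placeholder"
        // ),
    ]
)
```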
I agree in that glass is never 100% transparent. The problem I found with reflections is: if I set the transparency to 0.2, the alpha channel of the end result gets multiplied by that value. This seems to multiply by 0.2 even those fragments with a very strong specular highlight.

(1) Reality:
(a) A strong reflection in glass makes objects behind the glass invisible.
(b) Normal glass without any reflections adds a little of its tint, depending on thickness, to the color visible behind it.

Compare an image of a pair of glasses with plexiglass side protectors from my desk: https://imgur.com/cA5ylB0 - where there is a reflection, the glass becomes basically opaque.

(2) SceneKit:
(a) High opacity: reflections beautifully reflect lights etc., but unfortunately one can barely see the object behind the glass.
(b) Low opacity: great visibility of objects behind the glass, but now all the reflections are very subdued.

So what I ended up doing is adjusting the alpha channel of a fragment by the amount of specular lighting that fragment gets:

```glsl
vec3 light = _lightingContribution.specular;
float alpha = reflectivity * min(1.0, 0.33 * light.r + 0.33 * light.g + 0.33 * light.b);
_output.color.rgb *= min(1.0, (1.5 + 2 * \(minAlpha)) * alpha);
_output.color.a = (0.75 + 0.25 * \(minAlpha)) * alpha;
```

(We put that shader modifier into our release of https://apps.apple.com/app/id1463380262 in case you want to try it yourself.)

@gchiste I also implemented your tip using `blendMode = .add`, which is a good alternative to the above custom shader in some cases. I agree though that it doesn't work perfectly under varying lighting conditions.
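Since the constants in that modifier are easy to misread, here is the same alpha math in plain Python (a sketch for illustration only; `reflectivity` and `min_alpha` stand in for the material's reflectivity and the `\(minAlpha)` value interpolated into the shader string):

```python
# Plain-Python sketch of the shader-modifier alpha math above, to show how
# specular strength drives opacity. Not SceneKit API, just the arithmetic.
def glass_alpha(specular_rgb, reflectivity, min_alpha):
    r, g, b = specular_rgb
    # Average the specular contribution (0.33 per channel) and clamp to 1.0.
    alpha = reflectivity * min(1.0, 0.33 * r + 0.33 * g + 0.33 * b)
    # Scale factor applied to _output.color.rgb, clamped to 1.0.
    rgb_scale = min(1.0, (1.5 + 2 * min_alpha) * alpha)
    # Final fragment alpha written to _output.color.a.
    out_alpha = (0.75 + 0.25 * min_alpha) * alpha
    return rgb_scale, out_alpha

# Strong highlight -> nearly opaque; no highlight -> fully transparent.
print(glass_alpha((1.0, 1.0, 1.0), 1.0, 0.2))
print(glass_alpha((0.0, 0.0, 0.0), 1.0, 0.2))
```

The clamp on the rgb scale is what lets a bright highlight saturate to fully opaque while unlit fragments stay see-through.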
I used the following vertex shader back in the days of writing your OpenGL program yourself (the example came from a Khronos example app):

```glsl
attribute highp vec3 inVertex;
attribute mediump vec3 inNormal;

uniform highp mat4 MVPMatrix;
uniform mediump vec3 LightDirModel;
uniform mediump vec3 EyePosModel;
uniform bool bSpecular;
uniform bool bRotate;

varying lowp float SpecularIntensity;
varying mediump vec2 RefractCoord;

const mediump float cShininess = 3.0;
//const mediump float cRIR = 1.055;//1.015
uniform mediump float cRIR;

void main()
{
    // Transform position
    //inVertex.z = 10.0 * inVertex.z;
    gl_Position = MVPMatrix * vec4(inVertex, 1.0);

    // Eye direction in model space
    mediump vec3 eyeDirModel = normalize(inVertex - EyePosModel);

    // GLSL offers a nice built-in refraction function
    // Calculate refraction direction in model space
    mediump vec3 refractDir = refract(eyeDirModel, inNormal, cRIR);

    // Project refraction
    refractDir = (MVPMatrix * vec4(refractDir, 0.0)).xyw;

    // Map refraction direction to 2D coordinates
    RefractCoord = 0.5 * (refractDir.xy / refractDir.z) + 0.5;
    if (bRotate) // If the screen is rotated then rotate the UVs
    {
        RefractCoord.xy = 1.0 - RefractCoord.yx;
    }

    // Specular lighting
    // We ignore that N dot L could be negative (light coming
    // from behind the surface)
    SpecularIntensity = 0.0;
    if (bSpecular)
    {
        mediump vec3 halfVector = normalize(LightDirModel + eyeDirModel);
        lowp float NdotH = max(dot(inNormal, halfVector), 0.0);
        SpecularIntensity = pow(NdotH, cShininess);
    }
}
```

I haven't managed to get this working in the Metal/SceneKit framework though. There's also this GLSL tutorial: https://en.wikibooks.org/wiki/GLSL_Programming/Unity/Curved_Glass which could help as a foundation for a shader in SceneKit.
Installed Xcode 10 today, facing the exact same problem. The whole

```swift
ARReferenceImage.referenceImages(inGroupNamed: "characters", bundle: Bundle.main)
```

didn't work, while loading the pictures individually did, like so:

```swift
let warrior = ARReferenceImage(UIImage(named: "warrior")!.cgImage!,
                               orientation: CGImagePropertyOrientation.up,
                               physicalWidth: 0.038)
let barbarian = ARReferenceImage(UIImage(named: "barbarian")!.cgImage!,
                                 orientation: CGImagePropertyOrientation.up,
                                 physicalWidth: 0.038)
let rogue = ARReferenceImage(UIImage(named: "rogue")!.cgImage!,
                             orientation: CGImagePropertyOrientation.up,
                             physicalWidth: 0.038)
let mage = ARReferenceImage(UIImage(named: "mage")!.cgImage!,
                            orientation: CGImagePropertyOrientation.up,
                            physicalWidth: 0.038)

configuration.trackingImages = [warrior, rogue, mage, barbarian]
```

Guess it'll be fixed in one of the next betas.