Search results for “Visual Studio Maui IOS”

109,073 results found

Post · Replies · Boosts · Views · Activity
Recommended way to Subclass NSView subclasses (NSButton, NSTextField, NSPopUpButton) for Custom Visuals
I've been wrestling with applying my custom styles (example) to the OSX desktop app I've been building, and I'm wondering what the paradigms/idioms are for doing this. I'm using XIBs/Nibs for laying out the components on the windows; I'm setting the components' class to a custom subclass of the out-of-the-box component (NSTextField, NSButton, and NSPopUpButton, for example); and I'm overriding viewWillDraw in some and init? in others, making them all wantsLayer and setting attributes on those layers. This seems extremely clumsy at best, and some visual things are impossible to handle this way, like adding padding to an NSTextField. A lot of people describe how to make certain visual changes in viewDidLoad with a reference to a component, which seems very tedious if you have button/textfield styles that apply to the whole project. It seems like every visual change I've made with subclassing has been a nightmare, and there's no pattern for which visual changes are made where.
Topic: UI Frameworks · SubTopic: AppKit
1 reply · 0 boosts · 859 views · Jun ’16
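One common way to centralize this kind of styling, sketched below as a suggestion rather than anything from the thread: keep the layer attributes in a single shared helper and have each thin subclass call it from awakeFromNib. The AppStyle type and its values are hypothetical.

```swift
import AppKit

// Hypothetical shared style table; the values are illustrative only.
enum AppStyle {
    static let cornerRadius: CGFloat = 4
    static let background = NSColor.windowBackgroundColor.cgColor

    // One place that knows how to style any control's backing layer.
    static func apply(to view: NSView) {
        view.wantsLayer = true
        view.layer?.cornerRadius = cornerRadius
        view.layer?.backgroundColor = background
    }
}

// Each subclass just opts in; the actual styling lives in the helper,
// so project-wide changes happen in one file.
class StyledButton: NSButton {
    override func awakeFromNib() {
        super.awakeFromNib()
        AppStyle.apply(to: self)
    }
}

class StyledTextField: NSTextField {
    override func awakeFromNib() {
        super.awakeFromNib()
        AppStyle.apply(to: self)
    }
}
```

This doesn't solve cases like NSTextField padding (which needs a custom cell), but it avoids repeating layer setup in every subclass.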
Xamarin iOS app release issue.
Hello, when I tried to release an iOS app to production in Xamarin (Visual Studio), I got the error below. Could someone please help me solve this issue? I have the latest version of Visual Studio for Mac and Xcode version 13.0. "Unable to build chain to self-signed root for signer apple development."
1 reply · 0 boosts · 460 views · Nov ’21
Reply to Kernel panic crash with 2 external LG displays - Monterey 12.0.1 MacBook Pro (16-inch, 2019)
I am having the same issue with my Mac Studio Ultra. My current setup is 2 x LG 27UL850-W 4K monitors and 1 x Wacom Cintiq Pro 24. I had the two LG monitors plugged into an OWC Thunderbolt 4 Hub and the Cintiq Pro connected directly to the Studio. To isolate the issue, I plugged one of the LG monitors directly into the Studio and disconnected everything else, and I was still having the problem. The next day I connected the Wacom monitor, and I did not have the same issue.
Topic: App & System Services · SubTopic: Core OS
Apr ’22
iOS 18 simulator does not automatically show the software keyboard when started with the hardware keyboard connection disabled
Hello, I found that in the iOS 18 preset iPhone 16 Pro Max simulator, when I set it not to connect to the hardware keyboard, the software keyboard no longer shows after the simulator restarts. Input from the Mac keyboard also does not work. I must enable the hardware keyboard connection and disable it again to make the keyboard show. This does not happen in the iOS 17 iPhone 15 Pro Max simulator. Does anyone know why, or how to set it to always show the software keyboard? Thanks
1 reply · 0 boosts · 339 views · Nov ’24
Visual Effect View's Blur doesn't work on UIView from xib
I created a custom Numpad keyboard through a xib and wanted to add a blur effect to its background, so I added a Visual Effect View in the xib: https://i.stack.imgur.com/QjiwP.png The Main View and Visual Effect View background colors are set to Default, and I also tried Clear Color. The problem is that when I initialize the Numpad, the background has a light grey color without any blur effect: https://i.stack.imgur.com/LebUA.png How can I add a blur effect to the Numpad so the yellow square is blurred and visible? Code for NumpadView:

```swift
import UIKit

class NumpadView: UIView {
    @IBOutlet weak var resetButton: NumpadButton!
    @IBOutlet weak var decimalButton: NumpadButton!
    var target: UITextInput?
    var view: UIView?

    init(target: UITextInput, view: UIView) {
        super.init(frame: .zero)
        self.target = target
        self.view = view
        initializeSubview()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        initializeSubview()
    }

    func initializeSubview() {
        let xibFileName = "NumpadView"
        let view = Bundle.main.loadNibNamed(xibFileN… // snippet truncated in the original post
```

1 reply · 0 boosts · 842 views · Aug ’22
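One possible direction, offered as an assumption rather than a confirmed fix: when the nib's view hierarchy is re-parented, an opaque background ends up in front of the effect view. A minimal programmatic sketch where every container stays clear and the effect view is inserted behind all other subviews:

```swift
import UIKit

// Minimal sketch: give a container view a live blur backdrop in code.
// The container here stands in for the loaded NumpadView from the post.
func installBlur(in container: UIView) {
    // Anything opaque layered above the effect view hides the blur,
    // so the container itself must be clear.
    container.backgroundColor = .clear

    let blur = UIVisualEffectView(effect: UIBlurEffect(style: .systemMaterial))
    blur.frame = container.bounds
    blur.autoresizingMask = [.flexibleWidth, .flexibleHeight]

    // Index 0 keeps the numpad buttons on top of the blur.
    container.insertSubview(blur, at: 0)
}
```

The .systemMaterial style is one choice among several; the key points are the clear backgrounds and inserting the effect view at index 0.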
Reply to iOS 18 App Intents while supporting iOS 17
Are you sure this doesn't work? (not actual code)

```swift
if #available(iOS 18, *) {
    AppShortcut(intent: SearchSnippetIntent(),
                phrases: [
                    "Search \(.applicationName) Studio",
                    "Search \(.applicationName)"
                ],
                shortTitle: "Search",
                systemImageName: "magnifyingglass")
}
```

Have you tried something like: (not actual code)

```swift
@available(iOS 18, *)
struct SnippetsShortcutsAppShortcutsProvider: AppShortcutsProvider {
    // List all four items here
}

@available(iOS 17, *)
struct SnippetsShortcutsAppShortcutsProvider: AppShortcutsProvider {
    // List just the three iOS 17 ones here
}
```

Topic: Machine Learning & AI · SubTopic: General
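A version of the second suggestion that would at least compile needs two differently named types; whether the system accepts two AppShortcutsProvider conformances in one app is the open question in this thread. SearchSnippetIntent and the phrases are assumptions carried over from the reply:

```swift
import AppIntents

// Hypothetical intent standing in for the poster's real one.
struct SearchSnippetIntent: AppIntent {
    static let title: LocalizedStringResource = "Search Snippets"
    func perform() async throws -> some IntentResult { .result() }
}

// iOS 18 provider: would list the full set of shortcuts.
@available(iOS 18, *)
struct NewShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(intent: SearchSnippetIntent(),
                    phrases: ["Search \(.applicationName)"],
                    shortTitle: "Search",
                    systemImageName: "magnifyingglass")
    }
}

// iOS 17 provider: would list only the shortcuts available there.
@available(iOS 17, *)
struct LegacyShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(intent: SearchSnippetIntent(),
                    phrases: ["Search \(.applicationName)"],
                    shortTitle: "Search",
                    systemImageName: "magnifyingglass")
    }
}
```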
Visual glitches in certain apps 10.12.6 Beta 6 (16G24b)
Since upgrading to 10.12.6 Beta 6 (16G24b), certain apps have similar visual glitches where I see lots of rectangles with different colors, or sometimes all black. Some areas I've noticed it:
- Messages, where the conversation appears (the side pane to select a user, and the text entry field at the bottom, appear normal)
- Screen Sharing, where I see the remote desktop (the toolbar at the top of the window appears normal)
- Preview, where you see the document (the side pane where you can see a preview of a page appears normal)
Text looks especially strange in both Preview and Messages - I can sort of tell that the rectangles are making out letters, but it's completely illegible. rdar://33246839 if anyone internal is interested in my sysdiagnose.
1 reply · 0 boosts · 463 views · Jul ’17
RealityKit visualize the virtual depth texture from post-process callback
I am using RealityKit and the ARView PostProcessContext to get the sourceDepthTexture of the current virtual scene in RealityKit, using .nonAR camera mode. My experience with Metal is limited to the RealityKit GeometryModifier and SurfaceShader for CustomMaterial, but I am excited to learn more! Having studied the Underwater sample code, I have a general idea of how I want to explore the capabilities of a proper post-processing pipeline in my RealityKit project, but right now I just want to visualize this MTLTexture to see what the virtual depth of the scene looks like. Here's my current approach, trying to create a depth UIImage from the context sourceDepthTexture:

```swift
func postProcess(context: ARView.PostProcessContext) {
    let depthTexture = context.sourceDepthTexture
    var uiImage: UIImage? // or cg/ci
    if processPost {
        print("Post Process BLIT")
        // UIImage from MTLTexture
        uiImage = try createDepthUIImage(from: depthTexture)
        let blitEncoder = context.commandBuffer.makeBlitCommandEncoder()
        blitEncoder?.c… // snippet truncated in the original post
```

0 replies · 0 boosts · 775 views · Feb ’24
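A sketch of one way to finish that blit, under the assumption that sourceDepthTexture is GPU-private and must be copied into a shared-storage texture before the CPU can read it; the helper name and structure are illustrative, not from the post:

```swift
import Metal
import RealityKit

// Sketch: copy the GPU-only depth texture into a CPU-readable copy.
// `context` is the ARView.PostProcessContext passed to postProcess.
func makeReadableDepthCopy(context: ARView.PostProcessContext) -> MTLTexture? {
    let depth = context.sourceDepthTexture

    // A texture with the same format and size that the CPU may read.
    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: depth.pixelFormat,
        width: depth.width,
        height: depth.height,
        mipmapped: false)
    desc.storageMode = .shared

    guard let readableDepth = context.device.makeTexture(descriptor: desc),
          let blit = context.commandBuffer.makeBlitCommandEncoder() else {
        return nil
    }
    blit.copy(from: depth, to: readableDepth)
    blit.endEncoding()

    // Once the command buffer completes, readableDepth.getBytes(...)
    // can pull the raw depth values out for building a UIImage.
    return readableDepth
}
```

On macOS, depth textures generally cannot use .shared storage, so this particular copy is an iOS-flavored sketch.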
Reply to How to Fix Cracking and Popping Sound ?
In my case, increasing Bitwig Studio's block size from the automatic setting of 256 samples to 512 samples almost entirely fixes the issue. Oddly, the same instruments in Bitwig Studio on my 2010 Mac Pro don't have any crackling issues even with the block size set to 32 samples.
Topic: Community · SubTopic: Apple Developers
Mar ’23
Reply to Can't get my iPad app to NOT show in Mac App Store
Select your app in App Store Connect, then click Pricing and Availability in the sidebar. You'll see the iPhone and iPad Apps on Apple Silicon Mac section; deselect "Make this app available" to remove your iOS app from the Mac App Store. — WindowsMEMZ @ Darock Studio let myEmail = memz + 1 + @ + darock.top
Oct ’24
Reply to TensorFlow is slow after upgrading to Sonoma
Same problem here. Running: tensorflow-macos 2.6.0, tensorflow-metal 0.1.1. The Mac Studio with Sonoma uses 20-40% GPU, while the MBP with Ventura uses 80-90% GPU on the same code. And the Mac Studio with Sonoma is 6-7 times slower!
Topic: Graphics & Games · SubTopic: General
Oct ’23
Reply to Android Emulator on Silicon
What about Android Studio? Does it work? How are the performance and build times?
Nov ’20
Reply to PLZ HELP!!! Enrolled, Paid, but App Under Review for Over 2 Months
You can try to request an expedited review. — WindowsMEMZ @ Darock Studio
Topic: Community · SubTopic: Apple Developers
Oct ’24
Reply to Siri doesn't ding feedback when home button held
It appears that Apple is disabling that ding-ding as part of iOS 9. Apple Watch does the same quiet-but-visual listening interface with the animated light as part of watchOS 2. It does vibrate to let you know that it is listening.
Topic: App & System Services · SubTopic: Core OS
Jul ’15
Reply to Aspiring Developer
This one is good too: [Big Mountain Studio]
Topic: App & System Services · SubTopic: Core OS
Jun ’20