How to use AXBrailleMapRenderer in SwiftUI to improve UX for braille users

Hello,

I am developing an app for macOS and iOS using SwiftUI, and I want to provide a better experience for braille display users.

Braille display users rely on VoiceOver to access the OS and all apps.

My problem: I want to show an image that contains data. Speech output is not a problem: I use the accessibilityLabel attribute to provide the information through VoiceOver's speech output channel.
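For example, something like this (the image name and label text are only placeholders for my real content):

import SwiftUI

struct DataImageView: View {
    var body: some View {
        // Placeholder image and label; the real app uses my own asset and data.
        Image("temperatureChart")
            .accessibilityLabel("Temperature chart: 20 degrees at 9:00, 25 degrees at 12:00")
    }
}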

But I want to provide a better format for the braille output channel, because deafblind users can have trouble with the information in the accessibilityLabel value.

I found information about AXBrailleMap and AXBrailleMapRenderer, but there are no code samples and I couldn't get AXBrailleMapRenderer to work.

The information in the documentation doesn't seem to be enough to get started.

I tried it with an NSView in Swift, with a SwiftUI view, and with an NSView in Objective-C, but without success.

I need to do these tasks:

* Get the dimensions of the braille display
* Print a custom string on the braille display
* Get and set the state of specific pins (up or down) on the braille display to draw a small icon.

Can anybody help me?

Thanks in advance

Replies

Hey there!

The AXBrailleMap class has a property to access the dimensions of the display, represented as CGSize: https://developer.apple.com/documentation/accessibility/axbraillemap/3901390-dimensions?changes=_5
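For example, given an AXBrailleMap the system hands you, reading the size is as simple as this (a minimal sketch; the function name is just for illustration):

import Accessibility
import CoreGraphics

// Minimal sketch: inspect the pin-grid size of a braille map.
func logBrailleMapSize(_ map: AXBrailleMap) {
    let size: CGSize = map.dimensions
    print("Braille map dimensions: \(size.width) x \(size.height)")
}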

There are two ways to render content on the Braille display:

  1. Implement accessibilityBrailleMapRenderRegion (https://developer.apple.com/documentation/accessibility/axbraillemaprenderer/3901395-accessibilitybraillemaprenderreg?changes=_5). Using this API, you choose a region of an accessibility element that the system will render onto the braille display (there's a rough sketch covering both approaches after this list).

  2. Set the pins of the Braille display yourself, using accessibilityBrailleMapRenderer's map. The map object has methods to get and set the height of individual pins, as well as a helper method to present a CGImage directly on the Braille display. There is a small code sample here: https://developer.apple.com/documentation/accessibility/axbraillemaprenderer/3901396-accessibilitybraillemaprenderer?changes=_5

And the functions to get and set pin height are detailed here: https://developer.apple.com/documentation/accessibility/axbraillemap?changes=_5
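To make both approaches a little more concrete, here is a rough, untested sketch of an NSView adopting AXBrailleMapRenderer. Treat the property declarations, the pin value of 1.0, and the choice to render the view's full bounds as my assumptions, and double-check the exact imported types against AXBrailleMap.h.

import AppKit
import Accessibility

// Rough sketch only. Verify the exact property types the SDK imports
// for the AXBrailleMapRenderer protocol before relying on this.
class BrailleAwareView: NSView, AXBrailleMapRenderer {

    // Approach 1: tell the system which region of this element it
    // should render onto the braille display.
    var accessibilityBrailleMapRenderRegion: CGRect = .zero

    // Approach 2: update the pins yourself whenever the system asks
    // you to refresh the braille map.
    var accessibilityBrailleMapRenderer: (AXBrailleMap) -> Void = { map in
        let size = map.dimensions

        // Assumption: setHeight(_:at:) takes a value where 1 raises a
        // pin and 0 lowers it. Here we raise a single pin near the
        // center of the display.
        let center = CGPoint(x: (size.width / 2).rounded(),
                             y: (size.height / 2).rounded())
        map.setHeight(1.0, at: center)

        // The map also offers a helper to present a CGImage directly;
        // see the presentImage documentation linked above.
    }

    override func layout() {
        super.layout()
        // Illustrative choice: render the whole view.
        accessibilityBrailleMapRenderRegion = bounds
    }
}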

Thanks for your feedback about the information in the documentation. I understand that the code samples are limited; I'll see if we can spruce these pages up to make them more approachable and easier to pick up. In the meantime, I encourage you to check out the AXBrailleMap.h class header in Xcode to see the protocol properties and additional comments in the header, which may help you get started.

Please feel free to follow up here with additional specific questions if you continue having trouble implementing this API.

Hello,

Thanks for the help. I had already read those documentation pages, but I couldn't figure out how to implement the solution.

I will show my code with the errors, and I hope you can help me further.

I am developing a video capture app for macOS using SwiftUI.

Here is my custom view to show the preview from the camera.

Take a look at the setupBraille() function.

When I try to compile the project, Xcode gives me these errors:

Value of type 'CameraPreviewLayout' has no member 'accessibilityBrailleMapRenderer'

Unable to infer type of a closure parameter 'map' in the current context

Here is my code:

import SwiftUI
import AVFoundation
import Accessibility

struct CameraView: NSViewRepresentable {
    let captureSession: AVCaptureSession
    
    init(captureSession: AVCaptureSession) {
        self.captureSession = captureSession
    }
    
    func makeNSView(context: Context) -> CameraPreviewLayout {
        return CameraPreviewLayout(captureSession: captureSession)
    }
    
    func updateNSView(_ nsView: CameraPreviewLayout, context: Context) { }
}

class CameraPreviewLayout: NSView, AXBrailleMapRenderer {
    var previewLayer: AVCaptureVideoPreviewLayer?
    
    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
    
    init(captureSession: AVCaptureSession) {
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        super.init(frame: .zero)
        setupLayer()
        setupBraille()
    }
    
    func setupLayer() {
        previewLayer?.frame = self.frame
        previewLayer?.contentsGravity = .resizeAspectFill
        previewLayer?.videoGravity = .resizeAspectFill
        previewLayer?.connection?.automaticallyAdjustsVideoMirroring = false
        previewLayer?.connection?.isVideoMirrored = true
        layer = previewLayer
    }
    
    func setupBraille() {
        self.accessibilityBrailleMapRenderer = { map in
            // code to manage the braille display
        }
    }
}