
accessibilitysnapshot's People

Contributors

a-25, actions-user, akaduality, davidbrunow, dependabot[bot], eliperkins, fbernutz, fruitcoder, jhneves, kruegermj, luispadron, meherkasam, n8chur, nickentin, ra1028, royalpineapple, sherlouk, zeveisenberg

accessibilitysnapshot's Issues

Add ability to show newlines in accessibility descriptions

Since VoiceOver supports iterating through lines via the accessibility rotor, it can be important to be able to distinguish between line wrapping in the snapshots due to the layout of the legend and line wrapping with explicit newlines. I think it would be beneficial to show an inline icon (e.g. using images in attributed strings) in the accessibility description depicting any explicit newline characters.
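
As a rough sketch of the idea, assuming the description is rendered as an attributed string, something like the following could swap explicit newlines for an inline icon (the helper name and the use of the "return" SF Symbol are illustrative, not the library's API):

import UIKit

// Sketch only: replace each explicit newline in a description with an inline
// icon rendered via NSTextAttachment.
func attributedDescription(showingNewlinesIn description: String) -> NSAttributedString {
    let result = NSMutableAttributedString()
    for (index, line) in description.components(separatedBy: "\n").enumerated() {
        if index > 0 {
            // Insert the icon where the explicit newline character was.
            let attachment = NSTextAttachment()
            attachment.image = UIImage(systemName: "return") // iOS 13+
            result.append(NSAttributedString(attachment: attachment))
        }
        result.append(NSAttributedString(string: line))
    }
    return result
}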

Do you want more Localizations?

Since I am running my tests in a language other than English, the resulting snapshot is mixed-language. Should I add German to Localizable.strings, and if so, how did you get the strings for English? Just from experience, or do you have a sample app from which you can extract all the strings?

Add snapshot tests for iOS 17 and audit results

  • Add an iOS 17 device to the list in SnapshotTestCase and record new reference images. Make sure to choose a simulator that's included in the virtual image for our CI jobs.
  • Update the build scripts and CI config to include an iOS 17 build
  • Audit the reference images against VoiceOver on a device running iOS 17 and file issues for any changes/regressions where the behavior doesn't match

Add `IDETemplateMacro` to generate correct header in new files

When creating a new file in Xcode, the copyright text at the top of the file will differ from the text in the other files. Adding an IDETemplateMacros.plist file would generate this correctly and would help new contributors. The file could be added to the workspace of the example project at <WorkspaceName>.xcworkspace/xcshareddata/IDETemplateMacros.plist, so it would be shared across the workspace.

I could try this out and open a PR for it, if you want. What do you think?

(For more information, you can read this article: https://oleb.net/blog/2017/07/xcode-9-text-macros/. I think this is still the way to go even though the post is already three years old.)

Update ElementSectionViewController to use preset configurations

By updating ElementSectionViewController in the demo app to use preset configurations instead of a multi-step configuration process, we could simplify the process of auditing new versions against the snapshot tests in ElementSelectionTests. Currently you need to look at the test code to see how each case is configured. By changing to a set of named configurations, we could make this much more streamlined.
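
A minimal sketch of the preset shape, with hypothetical case names (the real cases would mirror the configurations exercised in ElementSelectionTests):

import UIKit

// Hypothetical presets; names and view setups are illustrative.
enum ElementSelectionPreset: String, CaseIterable {
    case noElements = "No Accessibility Elements"
    case singleElement = "Single Accessibility Element"
    case mixedElements = "Mixed Accessibility Elements"

    // Builds the view hierarchy for the preset, replacing the current
    // multi-step configuration flow in the demo app.
    func makeViews() -> [UIView] {
        switch self {
        case .noElements:
            return [UIView()]
        case .singleElement:
            let view = UIView()
            view.isAccessibilityElement = true
            return [view]
        case .mixedElements:
            let accessible = UIView()
            accessible.isAccessibilityElement = true
            return [accessible, UIView()]
        }
    }
}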

Using `imageWithSmartInvert` with `isRecording = true/false` fails every time

I am using imageWithSmartInvert with isRecording = true/false, and the test always fails:

assertSnapshot(matching: view, as: .imageWithSmartInvert)

My original image is:
[original image attached]

When I run with isRecording = true, it generates the black/golden reference image.
When I run with isRecording = false, the snapshot does not match the reference:

...
failed - Snapshot does not match reference.

"/Users/abc/Library/Developer/CoreSimulator/Devices/ACC40F6C-1421-4D1B-9008-20C0EB7B81EE/data/Containers/Bundle/Application/14408B86-E69D-4390-9E8D-0A050D98F518/SmallAppHost.app/PlugIns/SnapshotTests.xctest/SnapshotTestsBundle.bundle/Snapshots/Tests/testLoading.1.png"
@+
"/Users/abc/Library/Developer/CoreSimulator/Devices/ACC40F6C-1421-4D1B-9008-20C0EB7B81EE/data/Containers/Data/Application/E4FA0EDE-9A8C-4B4A-9E7B-4FDDBDF81EDB/tmp/Tests/testLoading.1.png"

Newly-taken snapshot does not match reference.
...

I tried opening the images at the above locations, and both are the same. [images attached]

Can you please have a look?

Update SnapshotTesting

This would require us to increase the minimum target to iOS 13 (dropping iOS 12 support) and drop CocoaPods support.

Snapshots do not reflect the use of accessibilityUserInputLabels

As of iOS 13, the accessibilityUserInputLabels array can be used as an alternative to a singular accessibilityLabel on an accessibility element. Per the dev docs, iOS gives this array first priority and falls back to the standard label if the array is empty or nil.

In our app, we have a button that sets this property with an array of two strings if the property is available. When AX snapshots are taken, they do not load the label(s) from the array. I would expect the snapshot to either reflect the first element (in our case, "Fast Forward") or to show some representation of the array, but it instead shows the underlying button name forward15 as if we never set any AX info.

The app itself handles the array fine (we use it for voice input) and if I set the usual accessibilityLabel the snapshot shows that label. Tests are running on iOS 14.3.
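
For reference, the setup is essentially the following (the second input label is a placeholder; only "Fast Forward" is from our app):

let button = UIButton(type: .system)
button.setImage(UIImage(named: "forward15"), for: .normal)
if #available(iOS 13.0, *) {
    // VoiceOver and voice input should prefer these over the standard label.
    button.accessibilityUserInputLabels = ["Fast Forward", "Skip Forward"]
}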

Should it consider userInputLabels in addition (if available)?

var accessibilityDescription = accessibilityLabelOverride(for: context) ?? accessibilityLabel ?? ""

Would it make sense to consider the userInputLabels when parsing for the accessibilityLabel? For example:

if #available(iOS 13.0, *) {
    // Produces "[0]first label [1]second label", falling back to the
    // existing description when no input labels are set.
    let userInputLabelsTagged = accessibilityUserInputLabels
        .enumerated()
        .map { "[\($0.offset)]\($0.element)" }
        .joined(separator: " ")
    if !userInputLabelsTagged.isEmpty {
        accessibilityDescription = userInputLabelsTagged
    }
}

That would produce something like:
"[0]some primary label text [1]some secondary explanation of the label text"

Update iOSSnapshotTestCase dependency

The iOSSnapshotTestCase dependency is currently set to ~> 6.0, but the latest available version is 8.0.0, which makes it impossible to incorporate AccessibilitySnapshot into projects that are already using the latest version of iOSSnapshotTestCase without downgrading. I'm very keen to incorporate AccessibilitySnapshot into a project I'm working on, but this is currently a blocker.

More than happy to submit a PR, though I noticed that there's an open PR (#72) for adding iOSSnapshotTestCase support via SPM, which might be a better place to make this change?

Snapshotting with invert colors doesn't respect snapshot identifier

Specifically, this affects the Swift version of the snapshot method in the iOSSnapshotTestCase subspec. The Objective-C version behaves correctly.

To reproduce:

func testView() {
    let view = UIView(frame: .init(x: 0, y: 0, width: 100, height: 100))
    SnapshotVerifyWithInvertedColors(view, identifier: "someIdentifier")
}

When run in record mode, this should produce a reference image named testView_someIdentifier_ followed by the device properties. Instead it produces a reference image named testView_ (omitting the identifier).

Improve layout of snapshot container for super wide views

We currently show a single column legend for wide views. When a view gets wide enough (as seen in this snapshot) we could split the legend into multiple columns, to avoid having so much empty space in the snapshot image.

This could potentially cause some issues with wrapping, resulting in messy diffs, but I think the benefits in reduced snapshot size might be worth it. Curious if anyone has thoughts on this.
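
A possible heuristic, sketched with an assumed per-column width:

import UIKit

// Illustrative only: split the legend into as many columns as fit under the
// snapshotted view. The column width constant is an assumption.
func legendColumnCount(forSnapshotWidth width: CGFloat) -> Int {
    let assumedColumnWidth: CGFloat = 280
    return max(1, Int(width / assumedColumnWidth))
}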

SwiftUI Views with ScrollViews for iOS 14 don't record Voice Over Descriptions

Something seems to have changed from iOS 13 to iOS 14 with SwiftUI ScrollViews.

When snapshotting a SwiftUI view with a ScrollView (iOS 14 only), no accessibility nodes can be found. I tried to debug this and found differences in the objects depending on whether the SwiftUI view had a ScrollView.

iOS 13: Elements inside a ScrollView are of type NSObject

[screenshot attached]

iOS 14: Elements inside a ScrollView are of type UIView (SwiftUI_UIShapeHitTestingView)

[screenshot attached]

This results in no accessibility nodes for the iOS 14 snapshot.

[snapshots attached: testSimpleSwiftUIWithScrollViewConfiguration on iOS 13 (375x812-13-3-3x) and iOS 14 (375x812-14-0-3x)]

I've tried to fix this but honestly I have no idea how πŸ™ˆ

I've also added a draft PR #34 with two new snapshot tests where you can see the resulting snapshots for these scenarios. If any more information is needed to look into this, I'm happy to help.

Missing description for draggable trait

When you have draggable elements, such as a table view with reorderable rows in edit mode, the drag button has a "draggable" descriptor and a hint explaining the behavior ("Double tap and hold, wait for the sound, then drag to re-arrange").

Discuss: Remove tight coupling on Uber library to improve flexibility

First off, can I say what a great library you've built already! It's a problem I've been looking to solve for a while now, and this is making great progress toward what I've pictured.

My one problem at the moment is that we use a different snapshot library (specifically this one from Pointfree!), and this library is tied to Uber/Facebook's snapshot library, presumably because it's what you use.

I'm curious if you would be open to breaking the project down some? I picture some sort of "core" module responsible for all the accessibility hierarchy parsing and even some of the image creation, with the snapshot-library-specific code going into separate modules.

So you may end up with the following specs (in a pods world):

  • AccessibilitySnapshot, the core parsing and image generation (with no snapshot dependencies)
  • AccessibilitySnapshot/iOSSnapshotTestCase, the Uber/FB-specific extensions

and the community could work together to supply an

  • AccessibilitySnapshot/Pointfree, containing the Pointfree extensions?

Open to discussion/thoughts on all of this though!

Snapshots should reflect non-descriptive behavioral traits

Currently, snapshots only reflect traits that affect the elements' spoken description. However, there are a lot of traits that don't add anything to the description but affect the behavior of assistive technologies in important ways. We should add icons to the markers to reflect the behavior controlled by these traits (a sketch of a possible mapping follows the list below). In particular, I think we should add support for:

  • .summaryElement
  • .allowsDirectInteraction
  • .updatesFrequently
  • .causesPageTurn
  • .playsSound
  • .startsMediaSession
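
As a starting point, the mapping could look something like this sketch (the icon asset names are placeholders; the actual rendering would live in the snapshot view's marker drawing):

import UIKit

// Placeholder asset names for the behavioral traits listed above.
let behavioralTraitIcons: [(trait: UIAccessibilityTraits, iconName: String)] = [
    (.summaryElement, "icon-summary-element"),
    (.allowsDirectInteraction, "icon-direct-interaction"),
    (.updatesFrequently, "icon-updates-frequently"),
    (.causesPageTurn, "icon-page-turn"),
    (.playsSound, "icon-plays-sound"),
    (.startsMediaSession, "icon-media-session"),
]

// Returns the icons to draw on a marker for the given element's traits.
func iconNames(for traits: UIAccessibilityTraits) -> [String] {
    behavioralTraitIcons
        .filter { traits.contains($0.trait) }
        .map { $0.iconName }
}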

VoiceOver reads attributed strings in multiple segments on iOS 15

VoiceOver now appears to read attributed strings in segments based on attribute ranges. You can see this behavior through the following example:

let attributedText = NSMutableAttributedString(string: "Hello world")
attributedText.addAttributes([.kern: 1], range: NSRange(location: 9, length: 1))

let label = UILabel()
label.attributedText = attributedText

VoiceOver will read this as Hello wor, l, d. This reproduces for a variety of attributes such as kerning, foreground color, and background color.

I was able to reproduce this on iOS 15.1.1, but not on iOS 13.3. As far as I know this is a new issue on iOS 15, but it's possible it applies to 14 as well.

Add custom actions to accessibility markers

There are two main chunks of work for supporting custom actions:

  1. Add custom actions to the AccessibilityMarker struct and populate them when parsing the accessibility hierarchy (a sketch of this addition follows the list).

  2. Show a list of custom actions in the AccessibilitySnapshotView.LegendView.
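
A sketch of the addition for step 1 (the real AccessibilityMarker struct has more properties; customActions is an assumed field name):

// Populated from each element's `accessibilityCustomActions` during parsing.
struct AccessibilityMarker {
    var description: String
    var hint: String?
    var customActions: [String] = []
}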

Add precision and tolerance apis

The missing tolerance was mentioned in #15, but no PR has been added yet. I'll leave this here while I'm experimenting with the implementation to fix failing snapshots on my CI.

CI builds for iOS 13 are flaking

The CI build on iOS 13 seems to be flaking on a few specific tests:

testTabBars()
- Item B: 3 items. Tab. 2 of 4.
+ Item B. Tab. 2 of 4.
- Item C: A. Tab. 3 of 4.
+ Item C. Tab. 3 of 4.

testStepper()
- Decrement. Button.
+ minus. Button.
- Increment. Button.
+ plus. Button.

testStepperAtMin()
- Decrement. Dimmed. Button.
+ minus. Dimmed. Button.
- Increment. Button.
+ plus. Button.

This started after bumping the minimum iOS version to 12.0 (#27). I've seen this flake twice now.

Snapshots should reflect semantic groups

iOS 13 introduced a .semanticGroup container type. This one is a bit more complicated than other container types since its effect on grouping depends on the current navigation style, so we should coordinate this with reflecting the navigation style in the snapshots as well.

Tab bar descriptions are incorrect on iOS 14 and later

From testing on iOS 14.0.1, the actual description around tab bars differs slightly from what shows up in tests:

  • The first item in the tab bar is prepended with "Tab Bar".
  • In the case of fake tab bars (i.e. not using UITabBarController), some of the element descriptions no longer indicate the position in the tab bar (i.e. "x of y").

Add snapshot tests for iOS 16 and audit results

  • Add iOS 16 to our build script and record new reference images
  • Update the CI config to include an iOS 16 build
  • Audit the reference images against VoiceOver on a device running iOS 16 and file issues for any changes/regressions where the behavior doesn't match

Readme should include screenshot example

Someone told me about this library and sent me a screenshot of some example usage, and it told a super compelling story. Classic "a picture is worth a thousand words" stuff. I think if the readme had a sample screenshot at the top, it would make the use of this repo a lot easier to understand.

Incorrect sort order for SwiftUI Form in iOS 16

The parsed accessibility markers are in the wrong order. See the screenshot, where the section headings are parsed as being at the bottom of the view. This is a SwiftUI view with no animations. Is there a workaround for this?
The order is important to me, as I use the accessibility markers to create a markdown representation of the view.

[screenshot attached: testExample MainA11y]

Text-based strategy

Hi! Thank you for the great tool!

I would like to create a text-based strategy that can be used to create non-fragile tests for VoiceOver. I described this idea in this discussion.

Do you think it would be possible to create this as part of your tool, on top of the core module? Or should I create a separate one that uses just the core part?

What kind of difficulties do you see?
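
For what it's worth, here is a rough sketch of what such a strategy could look like on top of the core module, assuming a parsing entry point that returns the markers (parseAccessibilityElements(in:) is an assumed method name):

import SnapshotTesting
import UIKit

// Sketch only: a text-based strategy built on SnapshotTesting's `.lines`
// format, comparing the parsed descriptions as plain text.
extension Snapshotting where Value == UIView, Format == String {
    static var accessibilityDescriptions: Snapshotting {
        Snapshotting<String, String>.lines.pullback { (view: UIView) in
            AccessibilityHierarchyParser()
                .parseAccessibilityElements(in: view)
                .map { $0.description }
                .joined(separator: "\n")
        }
    }
}

A test could then assert against a plain-text reference, which avoids the fragility of pixel comparisons:

assertSnapshot(matching: view, as: .accessibilityDescriptions)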

Snapshotting a view with an infinite activation point results in exception

Setting an element's accessibilityActivationPoint to infinity is a useful way to avoid having the element pass touch events to any views when it's activated.

element.accessibilityActivationPoint = .init(x: CGFloat.infinity, y: .infinity)

This currently causes an exception when AccessibilitySnapshotView tries to position the image view representing the activation point.

<unknown>:0: error: -[TestClass testAccessibility] : failed: caught "CALayerInvalidGeometry", "CALayer position contains NaN: [nan nan]"
(
	0   CoreFoundation                      0x000000011cbef1bb __exceptionPreprocess + 331
	1   libobjc.A.dylib                     0x000000011b7f7735 objc_exception_throw + 48
	2   CoreFoundation                      0x000000011cbef015 +[NSException raise:format:] + 197
	3   QuartzCore                          0x000000011ac941a3 _ZN2CA5Layer12set_positionERKNS_4Vec2IdEEb + 141
	4   QuartzCore                          0x000000011ac83293 -[CALayer setPosition:] + 57
	5   UIKitCore                           0x0000000127e12ef5 -[UIView setCenter:] + 268
        ...
)
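
One possible fix, sketched with illustrative names for the snapshot view's internals:

// Skip positioning the marker when the activation point is not finite.
// `activationPointView` and `element` are illustrative names.
let activationPoint = element.accessibilityActivationPoint
if activationPoint.x.isFinite && activationPoint.y.isFinite {
    activationPointView.center = activationPoint
} else {
    // An infinite activation point cannot be drawn; hide the marker instead.
    activationPointView.isHidden = true
}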

Data table descriptions are incorrect on iOS 13 and later

The accessibility description for elements in a data table appears to have subtly changed on iOS 13. The differences can be seen in the following tests:

testDataTable

  • The final element now reads as "B5: Value. Row 5. Column 2. Dimmed. Button. Heading. Link. Adjustable. Image. Search Field. 5 of 1." / "Hint"

The most obvious change here is that the accessibilityValue comes before the row and column numbers. I'm not sure where the "5 of 1" came from.

testDataTableWithHeaders

  • The first element in the table now reads as "B1: B1 Value. C1: C1 Value. A2: A2 Value. A3: A3 Value. A1: A1 Value. Row Header. Column Header. Row 1. Column 1."
  • The addition of the "Row Header" label is applied to all cells in the first three columns.
  • The addition of the "Column Header" label is applied to all cells in the first three rows.
  • The movement of the accessibilityValue described above is applied to all cells.

testDataTableWithUndefinedRows

  • The first element in the table now reads as "A2: A2 Value. A3: A3 Value. A1: A1 Value. Column Header. Row -9223372036854775808. Column 1"
  • The second element in the table now reads as "B2: B2 Value. B3: B3 Value. B1: B1 Value. Column Header. Column 2."
  • The addition of the "Column Header" label is applied to all cells in the first three rows.
  • The movement of the accessibilityValue described above is applied to all cells.

I think the row number in the first cell is coming from an overflow of NSNotFound + 1. This feels like a regression in VoiceOver, but we should try to match the behavior of VoiceOver in our descriptions as closely as possible.

Remove the use of UIApplication.shared

When importing into a project with Require Only App-Extension-Safe API = YES, the use of UIApplication.shared is not allowed. This is used in AccessibilityHierarchyParser.swift to determine the user interface layout direction.

This information can instead be determined using let userInterfaceLayoutDirection = root.effectiveUserInterfaceLayoutDirection (available as of iOS 10.0), which is also a better way of determining the layout direction since it is specific to each UIView being analyzed.
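
Concretely (`root` being the root view passed to the parser):

// Before (not app-extension-safe):
// let layoutDirection = UIApplication.shared.userInterfaceLayoutDirection

// After (per-view, available since iOS 10):
let layoutDirection = root.effectiveUserInterfaceLayoutDirection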

Switch default snapshot engine over to SnapshotTesting

We should change the default subspec from FBSnapshotTestCase to SnapshotTesting. SnapshotTesting has a deployment target of iOS 11, while we're currently at iOS 10, so it probably makes sense to bump our deployment target in parallel to this change.

Snapshots should include images for custom actions

iOS 14 added the ability to set a custom image for a UIAccessibilityCustomAction. We should replace the down arrow icon for custom actions in the snapshot images with the appropriate icon for Switch Control (a box containing either the specified image or the first character of the action's name).
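
For reference, the iOS 14 API in question, attached to some element `view` (the action name, handler, and symbol here are just examples):

if #available(iOS 14.0, *) {
    let archive = UIAccessibilityCustomAction(name: "Archive") { _ in
        // Perform the action; return true on success.
        true
    }
    // New in iOS 14: assistive technologies can display this image.
    archive.image = UIImage(systemName: "archivebox")
    view.accessibilityCustomActions = [archive]
}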

Add snapshot tests for iOS 15 and audit results

This'll include changes to the GitHub Actions workflow. We'll want to double check that everything still works.

We may also want to explore the new DocC feature set to ensure we have good coverage across the board, and offer up some basic guides on how to use the package.

Bump minimum supported iOS version to 12.0

I think at this point we can safely bump the minimum supported iOS version to 12.0. The oldest test device I have is now running iOS 12.4.8, so I won't be able to test on iOS 11 anymore. And dropping iOS 11 support gives us the option of running all of our CI builds through GitHub Actions.

If consumers still need to support iOS 10 or 11, they can remain on the current version (0.3.2).

Support SwiftUI previews

I have no idea how feasible it would be to build this, given the requirement for hosted tests, but I'd love it if I could wrap my SwiftUI previews in some utility from this library that would let me see my accessibility labels alongside my view while developing, instead of after the fact when snapshot testing.

Add SwiftUI snapshot support

SnapshotTesting supports SwiftUI views, so this should be a case of adding the relevant extension to our spec code. This will save users from having to first wrap the view in a UIHostingController, simplifying its usage.
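
A sketch of the shape this could take, assuming the library's existing UIView-based .accessibilityImage strategy and a fixed device-sized frame (both assumptions a real implementation would refine):

import SwiftUI
import SnapshotTesting
import UIKit

// Sketch only: wrap the SwiftUI view in a UIHostingController and reuse the
// existing UIView-based strategy. The hard-coded size is an assumption.
extension Snapshotting where Value: SwiftUI.View, Format == UIImage {
    static var accessibilityImage: Snapshotting {
        Snapshotting<UIView, UIImage>.accessibilityImage.pullback { view in
            let controller = UIHostingController(rootView: view)
            controller.view.frame = CGRect(x: 0, y: 0, width: 375, height: 812)
            return controller.view
        }
    }
}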

Add support for snapshotting dynamic type

The framework currently contains a partial implementation of dynamic type snapshotting. See the SnapshotVerify(_:at:) method for the entry point, which is currently marked as internal so as not to expose it in the public API. The TextAccessibilityTests class contains a sample of how this can be used.

Snapshotting dynamic type would allow for the same view to be tested at multiple content size categories in the same test, for example:

SnapshotVerify(view, at: .extraSmall, identifier: "XS")
SnapshotVerify(view, at: .extraExtraExtraLarge, identifier: "XXXL")

The current implementation works for some configurations, but the logic to invalidate the current layout doesn't yet match what happens when you change the content size category on device (for example, in Control Center while the app is open). This causes the test results to differ from what it would actually look like for a user to change the setting on device.
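
For context, a sketch of one way to apply a category override, using the standard trait-override mechanism (illustrative only; this is not what the partial implementation does):

import UIKit

// Host the view in a child view controller and override its traits. Views
// that respond to traitCollectionDidChange(_:) will pick up the new category;
// views observing UIContentSizeCategory.didChangeNotification will not, which
// is related to the mismatch described above.
func host(_ view: UIView, at category: UIContentSizeCategory) -> UIViewController {
    let parent = UIViewController()
    let child = UIViewController()
    child.view.addSubview(view)
    parent.addChild(child)
    parent.view.addSubview(child.view)
    child.didMove(toParent: parent)
    let traits = UITraitCollection(preferredContentSizeCategory: category)
    parent.setOverrideTraitCollection(traits, forChild: child)
    return parent
}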
