cashapp / accessibilitysnapshot
Easy regression testing for iOS accessibility
License: Apache License 2.0
Since VoiceOver supports iterating through lines via the accessibility rotor, it can be important to be able to distinguish between line wrapping in the snapshots due to the layout of the legend and line wrapping with explicit newlines. I think it would be beneficial to show an inline icon (e.g. using images in attributed strings) in the accessibility description depicting any explicit newline characters.
Since I am running my tests in a language other than English, the resulting snapshot is mixed-language. Should I add German to Localizable.strings, and if so, how did you get the strings for English? Were they collected from experience, or do you have a sample app that you extracted all the strings from?
It would be nice to support https://github.com/pointfreeco/swift-snapshot-testing for taking the screenshots. It's a 100% Swift library and it already has SPM support.
SnapshotTestCase
and record new reference images. Make sure to choose a simulator that's included in the virtual image for our CI jobs.
When creating a new file in Xcode, the copyright text at the top of the file will differ from the text in the other files. Adding an IDETemplateMacros.plist
file would generate this header correctly and help new contributors. The file could be added to the workspace of the example project (<WorkspaceName>.xcworkspace/xcshareddata/IDETemplateMacros.plist), so it would be shared across the workspace.
I could try this out and add a PR for that, if you want. What do you think about that?
(For more information, see this article: https://oleb.net/blog/2017/07/xcode-9-text-macros/. I think this is still the right approach even though the post is already three years old.)
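For illustration, a minimal IDETemplateMacros.plist that overrides the FILEHEADER macro might look like the following (the header text below is a placeholder, not the project's actual copyright header):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<key>FILEHEADER</key>
	<string>
//  Placeholder copyright line for the project.
//
//  Licensed under the Apache License, Version 2.0.</string>
</dict>
</plist>
```

Xcode substitutes this string in place of the default header comment in new files, so every contributor gets the same header without configuring anything locally.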
By updating ElementSectionViewController
in the demo app to use preset configurations instead of a multi-step configuration process, we could simplify the process of auditing new versions against the snapshot tests in ElementSelectionTests
. Currently you need to look at the test code to see how each case is configured. By changing to a set of named configurations, we could make this much more streamlined.
I am using the imageWithSmartInvert
strategy with isRecording = true/false
and the test always fails:
assertSnapshot(matching: view, as: .imageWithSmartInvert)
When I run with isRecording = true,
it generates the black/golden reference image.
When I run with isRecording = false,
the snapshot does not match the reference.
.
.
.
failed - Snapshot does not match reference.
"/Users/abc/Library/Developer/CoreSimulator/Devices/ACC40F6C-1421-4D1B-9008-20C0EB7B81EE/data/Containers/Bundle/Application/14408B86-E69D-4390-9E8D-0A050D98F518/SmallAppHost.app/PlugIns/SnapshotTests.xctest/SnapshotTestsBundle.bundle/Snapshots/Tests/testLoading.1.png"
@+
"/Users/abc/Library/Developer/CoreSimulator/Devices/ACC40F6C-1421-4D1B-9008-20C0EB7B81EE/data/Containers/Data/Application/E4FA0EDE-9A8C-4B4A-9E7B-4FDDBDF81EDB/tmp/Tests/testLoading.1.png"
Newly-taken snapshot does not match reference.
.
.
.
I tried opening the images at the above locations; both are the same.
Can you please take a look?
Views that are very large in at least one dimension (reproduced with a view that was 2870 pt tall) will silently fail to render when using the .drawHierarchyInRect
rendering mode, producing an empty snapshot of the view.
cc @efirestone
This would require us to increase the minimum target to iOS 13 (dropping iOS 12 support) and drop CocoaPods support.
In iOS 16, VoiceOver no longer states "actions available" for elements that have custom accessibility actions. We should remove this text from the legend.
As of iOS 13, the accessibilityUserInputLabels
array can be used as an alternative to a singular accessibilityLabel
on an accessibility element. Per the dev docs, iOS gives this array first priority and falls back to the standard label if the array is empty or nil.
In our app, we have a button that sets this property with an array of two strings if the property is available. When AX snapshots are taken, they do not load the label(s) from the array. I would expect the snapshot to either reflect the first element (in our case, "Fast Forward") or to show some representation of the array, but it instead shows the underlying button name forward15
as if we never set any AX info.
The app itself handles the array fine (we use it for voice input) and if I set the usual accessibilityLabel
the snapshot shows that label. Tests are running on iOS 14.3.
Would it make sense to consider the userInputLabels when parsing for the accessibilityLabel? For example:
if #available(iOS 13.0, *) {
    var userInputLabelsTagged: String = ""
    for i in 0..<accessibilityUserInputLabels.count {
        if i > 0 {
            userInputLabelsTagged += " "
        }
        userInputLabelsTagged += "[\(i)]"
        userInputLabelsTagged += accessibilityUserInputLabels[i]
    }
    if !userInputLabelsTagged.isEmpty {
        accessibilityDescription = userInputLabelsTagged
    }
}
That would produce something like:
"[0]some primary label text [1]some secondary explanation of the label text"
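A minimal sketch of that suggestion in more idiomatic Swift (the helper name is hypothetical, not part of the library):

```swift
// Hypothetical helper: joins accessibilityUserInputLabels into a single
// tagged description string in the "[0]… [1]…" format suggested above.
func taggedDescription(from userInputLabels: [String]) -> String {
    return userInputLabels
        .enumerated()
        .map { index, label in "[\(index)]\(label)" }
        .joined(separator: " ")
}

// Example: taggedDescription(from: ["Fast Forward", "Skip Ahead"])
// returns "[0]Fast Forward [1]Skip Ahead".
```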
The iOSSnapshotTestCase dependency is currently set to ~> 6.0,
but the latest available version is 8.0.0, which makes it impossible to incorporate AccessibilitySnapshot into projects that are already using the latest version of iOSSnapshotTestCase without downgrading. I'm very keen to incorporate AccessibilitySnapshot into a project I'm working on, but this is currently a blocker.
More than happy to submit a PR, though I noticed that there's an open PR (#72) for adding iOSsnapshotTestCase support via SPM, which might be a better place to make this change?
Specifically the Swift version of the snapshot method in the iOSSnapshotTestCase
subspec. The Objective-C version of this behaves correctly.
To reproduce:
func testView() {
    let view = UIView(frame: .init(x: 0, y: 0, width: 100, height: 100))
    SnapshotVerifyWithInvertedColors(view, identifier: "someIdentifier")
}
When run in record mode, this should produce a reference image named testView_someIdentifier_
followed by the device properties. Instead it produces a reference image named testView_
(omitting the identifier).
GitHub Actions has deprecated its macOS 10.15 runners. We have until the end of the month to migrate off.
We currently show a single column legend for wide views. When a view gets wide enough (as seen in this snapshot) we could split the legend into multiple columns, to avoid having so much empty space in the snapshot image.
This could potentially cause some issues with wrapping, resulting in messy diffs, but I think the benefits in reduced snapshot size might be worth it. Curious if anyone has thoughts on this.
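As a rough sketch of how a column count could be chosen (the 250-pt minimum column width is an assumed value for illustration, not taken from the library):

```swift
import Foundation

// Hypothetical sketch: derive a legend column count from the available width,
// falling back to a single column for narrow views. The minimum column width
// is an assumed constant, not a value from AccessibilitySnapshot.
func legendColumnCount(forWidth width: CGFloat, minimumColumnWidth: CGFloat = 250) -> Int {
    return max(1, Int(width / minimumColumnWidth))
}
```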
Something seems to have changed from iOS 13 to iOS 14 with SwiftUI ScrollViews.
When snapshotting a SwiftUI View with a ScrollView (only iOS 14), no Accessibility Nodes can be found. I tried to debug this and found differences in the objects when the SwiftUI View had a ScrollView and when it had no ScrollView.
iOS 13: Elements inside a ScrollView are of type NSObject
iOS 14: Elements inside a ScrollView are of type UIView (SwiftUI_UIShapeHitTestingView)
This results in no accessibility nodes for the iOS 14 snapshot.
[Side-by-side screenshots comparing the iOS 13 and iOS 14 snapshots]
I've tried to fix this, but honestly I have no idea how.
I've also added a draft PR #34 with two new snapshot tests where you can see the resulting snapshot for these scenarios. If any more information is needed to look into this, I'm happy to help.
When you have draggable elements, such as a table view with reorderable rows in edit mode, the drag button has a "draggable" descriptor and a hint explaining the behavior ("Double tap and hold, wait for the sound, then drag to re-arrange").
First off, can I say what a great library you've built already! It's a problem I've been looking to solve for a while now, and this is making great progress toward what I've pictured.
My one problem at the moment is that we use a different snapshot library (specifically this one from Pointfree!), while this library, presumably because it's what you use, is tied to Uber/Facebook's snapshot library.
I'm curious whether you would be open to breaking the project down a bit. I picture some sort of "core" module responsible for all the accessibility hierarchy parsing and even some of the image creation, with the snapshot-library-specific code going into separate modules.
So you may end up with the following specs (in a pods world):
AccessibilitySnapshot
which is the core parsing and image generation (but no snapshot dependencies)
AccessibilitySnapshot/iOSSnapshotTestCase
which is the Uber/FB specific extensions
and the community could work together to supply a
AccessibilitySnapshot/Pointfree
which is the Pointfree extensions?
Open to discussion/thoughts on all of this though!
Currently snapshots only reflect traits that affect the elements' spoken description. However there are a lot of traits that don't add anything to the description, but affect the behavior of assistive technologies in important ways. We should add icons to the markers to reflect the behavior controlled by these traits. In particular, I think we should add support for:
.summaryElement
.allowsDirectInteraction
.updatesFrequently
.causesPageTurn
.playsSound
.startsMediaSession
VoiceOver now appears to read attributed strings in segments based on attribute ranges. You can see this behavior through the following example:
let attributedText = NSMutableAttributedString(string: "Hello world")
attributedText.addAttributes([.kern: 1], range: NSRange(location: 9, length: 1))
let label = UILabel()
label.attributedText = attributedText
VoiceOver will read this as Hello wor, l, d
. This reproduces for a variety of attributes such as kerning, foreground color, and background color.
I was able to reproduce this on iOS 15.1.1, but not on iOS 13.3. As far as I know this is a new issue on iOS 15, but it's possible it applies to 14 as well.
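The segmentation can be reproduced outside of VoiceOver by enumerating the string's attribute runs. This sketch uses a stand-in attribute key, since UIKit's .kern isn't available in plain Foundation:

```swift
import Foundation

// Sketch: split a string into its attribute runs, which appear to be the
// boundaries VoiceOver reads in segments. The "kern" key here is a stand-in
// for UIKit's NSAttributedString.Key.kern.
let attributedText = NSMutableAttributedString(string: "Hello world")
attributedText.addAttributes(
    [NSAttributedString.Key("kern"): 1],
    range: NSRange(location: 9, length: 1)
)

var segments: [String] = []
attributedText.enumerateAttributes(
    in: NSRange(location: 0, length: attributedText.length)
) { _, range, _ in
    segments.append((attributedText.string as NSString).substring(with: range))
}
// segments == ["Hello wor", "l", "d"]
```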
There are two main chunks of work for supporting custom actions:
Add custom actions to the AccessibilityMarker
struct and populate them when parsing the accessibility hierarchy.
Show a list of custom actions in the AccessibilitySnapshotView.LegendView.
The missing tolerance
parameter was mentioned in #15, but no PR has been added yet. I'll leave this here while I experiment with an implementation to fix failing snapshots on my CI.
The CI build on iOS 13 seems to be flaking on a few specific tests:
testTabBars()
- Item B: 3 items. Tab. 2 of 4.
+ Item B. Tab. 2 of 4.
- Item C: A. Tab. 3 of 4.
+ Item C. Tab. 3 of 4.
testStepper()
- Decrement. Button.
+ minus. Button.
- Increment. Button.
+ plus. Button.
testStepperAtMin()
- Decrement. Dimmed. Button.
+ minus. Dimmed. Button.
- Increment. Button.
+ plus. Button.
This started after bumping the minimum iOS version to 12.0 (#27). I've seen this flake twice now.
iOS 13 introduced a .semanticGroup
container type. This one is a bit more complicated than other container types, since its grouping behavior depends on the current navigation style, so we should coordinate this work with reflecting the navigation style in the snapshots as well.
From testing on iOS 14.0.1, the actual description around tab bars differs slightly from what shows up in tests:
UITabBarController
), some of the element descriptions no longer indicate the position in the tab bar (i.e. "x of y").
As far as I can see, the framework relies heavily on SnapshotTesting. Why not add this functionality as a PR to the existing framework instead of creating a new one?
Someone told me about this library, and they sent me a screenshot of some example usage, and it told a super compelling story. Classic "a picture is worth a thousand words" stuff. I think if the readme had a sample screenshot at the top, it would make the use of this repo a lot easier to understand.
The parsed accessibility markers are in the wrong order. See the screenshot, where the section headings are parsed as being at the bottom of the view. This is a SwiftUI view with no animations. Is there a workaround for this?
The order is important to me as I use the accessibility markers to create a markdown representation of the view.
The two extension functions on Snapshotting
for UIView
and UIViewController
in SnapshotTesting+Accessibility.swift
that make it possible to modify the content size category for the snapshot should be public, so they can be used outside the target.
Issue exists in Version 0.4.0 of AccessibilitySnapshot
Hi! Thank you for the great tool!
I would like to create a text-based strategy that can be used to write non-fragile tests for VoiceOver. I described this idea in this discussion.
Do you think it would be possible to build this as part of your tool, on top of the core module? Or should I create a separate one that uses just the core part?
What kinds of difficulties do you see?
Setting an element's accessibilityActivationPoint
to infinity is a useful way to avoid having the element pass touch events to any views when it's activated.
element.accessibilityActivationPoint = .init(x: CGFloat.infinity, y: .infinity)
This currently causes an exception when AccessibilitySnapshotView
tries to position the image view representing the activation point.
<unknown>:0: error: -[TestClass testAccessibility] : failed: caught "CALayerInvalidGeometry", "CALayer position contains NaN: [nan nan]"
(
0 CoreFoundation 0x000000011cbef1bb __exceptionPreprocess + 331
1 libobjc.A.dylib 0x000000011b7f7735 objc_exception_throw + 48
2 CoreFoundation 0x000000011cbef015 +[NSException raise:format:] + 197
3 QuartzCore 0x000000011ac941a3 _ZN2CA5Layer12set_positionERKNS_4Vec2IdEEb + 141
4 QuartzCore 0x000000011ac83293 -[CALayer setPosition:] + 57
5 UIKitCore 0x0000000127e12ef5 -[UIView setCenter:] + 268
...
)
Carthage integration is currently missing and needs to be added.
The accessibility description for elements in a data table appears to have subtly changed on iOS 13. The differences can be seen in the following tests:
The most obvious change here is that the accessibilityValue
comes before the row and column numbers. I'm not sure where the "5 of 1" came from. In these tests (including testDataTableWithUndefinedRows), the accessibilityValue
from before is applied to all cells. I think the row number in the first cell is coming from an overflow of NSNotFound + 1.
This feels like a regression in VoiceOver, but we should try to match VoiceOver's behavior in our descriptions as closely as possible.
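A small sketch of the suspected overflow (whether VoiceOver's arithmetic wraps in exactly this way is a guess, not something confirmed by the source):

```swift
import Foundation

// NSNotFound is defined as Int.max, so computing a one-based row number as
// NSNotFound + 1 overflows. Swift's wrapping operator &+ reproduces the
// two's-complement wrap that Objective-C arithmetic would produce.
let undefinedRow = NSNotFound        // Int.max
let displayedRow = undefinedRow &+ 1 // wraps to Int.min
```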
When importing into a project with Require Only App-Extension-Safe API = YES
, the use of UIApplication.shared is not allowed. This is used in AccessibilityHierarchyParser.swift to determine the user interface layout direction.
This information can instead be determined using let userInterfaceLayoutDirection = root.effectiveUserInterfaceLayoutDirection
(available as of iOS 10.0), which is also a better way of determining the layout direction, since it is specific to each UIView being analyzed.
We should change the default subspec from FBSnapshotTestCase
to SnapshotTesting
. SnapshotTesting
has a deployment target of iOS 11, while we're currently at iOS 10, so it probably makes sense to bump our deployment target in parallel to this change.
iOS 14 added the ability to set a custom image for a UIAccessibilityCustomAction
. We should replace the down arrow icon for custom actions in the snapshot images with the appropriate icon for Switch Control (a box containing either the specified image or the first character of the action's name).
Currently we have individual snapshot tests for each configuration of a switch. Changing over to a single snapshot of the SwitchControlViewController
from the demo app would make it easier to audit new iOS versions.
This'll include changes to the GitHub Actions workflow. We'll want to double check that everything still works.
We may also want to explore the new DocC feature set to ensure we have good coverage across the board, and offer up some basic guides on how to use the package.
CIImage
renders a blank image when applying a CIFilter
to larger images, which results in blank accessibility content in the accessibility snapshot images. This appears to be iOS 13 specific and does not apply to iOS 12 or 14.
Now that iOSSnapshotTestCase supports SPM in 7.0.0 we can update the package manifest to include a product for using AccessibilitySnapshot with iOSSnapshotTestCase as the snapshot engine.
I think at this point we can safely bump the minimum supported iOS version to 12.0. The oldest test device I have is now running iOS 12.4.8, so I won't be able to test on iOS 11 anymore. And dropping iOS 11 support gives us the option of running all of our CI builds through GitHub Actions.
If consumers still need to support iOS 10 or 11, they can remain on the current version (0.3.2).
I have no idea how feasible it would be to build this, given the requirement for hosted tests, but I'd love if I could wrap my SwiftUI previews in some utility from this library that would let me see my accessibility labels alongside my view while developing, instead of after the fact when snapshot testing.
SnapshotTesting supports SwiftUI views, so this should just be a case of adding the relevant extension to our spec code. It will save our users from having to first wrap the view in a UIHostingController, simplifying its usage.
The snapshot shows an empty description, but in reality iOS reads out Text Field
or Text Field. Double tap to edit.
This is likely because the text field has an undocumented trait that is not handled by the library.
The framework currently contains a partial implementation of dynamic type snapshotting. See the SnapshotVerify(_:at:)
method for the entry point, which is currently marked as internal
so as not to expose it in the public API. The TextAccessibilityTests
class contains a sample of how this can be used.
Snapshotting dynamic type would allow for the same view to be tested at multiple content size categories in the same test, for example:
SnapshotVerify(view, at: .extraSmall, identifier: "XS")
SnapshotVerify(view, at: .extraExtraExtraLarge, identifier: "XXXL")
The current implementation works for some configurations, but the logic that invalidates the current layout doesn't yet match what happens when you change the content size category on device (for example, in Control Center while the app is open). This causes the test results to differ from what a user would actually see after changing the setting on device.