facebookarchive / ios-snapshot-test-case

Snapshot view unit tests for iOS

iPhone 7 tests all fail

robseward opened this issue

I have some device-agnostic tests I'm running on iOS 10. Every test passes on an iPhone 6 simulator, but they all fail on an iPhone 7 simulator. Looking at the diff image, you can see a very faint outline of the design elements. My casual inspection of the two rendered images hasn't found any differences.

Here's the diff image. (If you expand it you can see ghosting in the lower right. I think GitHub has processed this image; I can see different ghosting patterns on the original file. If you can't see the ghosting, you can detect it with Digital Color Meter.)
[image: diff_artwork_detail__should_have_the_correct_snapshot_iphone10_0_375x667 2x]

The reference image, rendered in the iPhone 6s simulator:
[image: reference_artwork_detail__should_have_the_correct_snapshot_iphone10_0_375x667 2x]

The failed iPhone 7 render:
[image: failed_artwork_detail__should_have_the_correct_snapshot_iphone10_0_375x667 2x]

I'm not sure if FBSnapshot is working correctly here or if there is a bug, but it would be great to get to the bottom of why this is happening so we can handle the discrepancies.

Oh... my guess would be Apple's wide-gamut color profile. Maybe we can't have device-agnostic tests with the iPhone 7 for this reason.

@robseward Mocking UITraitCollection.displayGamut might fix the problem, but I'm not sure. Could you upload a sample project to test against?
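
Something along these lines might work (untested sketch; `makeSRGBHost` is a made-up helper name, and it relies on UIViewController's supported setOverrideTraitCollection(_:forChild:) API):

```swift
import UIKit

// Untested sketch: host the view under test in a child view controller whose
// trait collection is overridden to report an sRGB display gamut, so the
// iPhone 7 simulator's P3 traits shouldn't leak into the snapshot render.
func makeSRGBHost(for viewUnderTest: UIView) -> UIViewController {
    let child = UIViewController()
    child.view = viewUnderTest

    let parent = UIViewController()
    parent.addChild(child)
    parent.view.addSubview(child.view)
    child.didMove(toParent: parent)

    // Force the child's traits to sRGB regardless of the simulator's hardware gamut.
    let srgbTraits = UITraitCollection(displayGamut: .SRGB)
    parent.setOverrideTraitCollection(srgbTraits, forChild: child)
    return parent
}
```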

I'm seeing the same issue, but my tests aren't device-agnostic. I had some screens on the iPhone 6s that passed on all versions of iOS and now all break on the iPhone 7.

When I find the time I will upload a project that reproduces the issue, but I know some of my storyboards use different color spaces for some colors: some use sRGB IEC61966-2.1 and some use Generic RGB. I'm not sure how that happened, other than it was accidental, but I'm wondering if it's causing the problem.

@robseward @jwilliams-handy Sample projects would be awesome!

I'm also seeing the issue. It seems to fail consistently on tests that include image assets; it looks like the comparison fails on the antialiased image edges. I've tried overriding the trait collection's displayGamut to force sRGB on the snapshot view, passing a similar trait collection to the imageNamed: method when I fetch the image from my bundle, and turning prefersExtendedRange off on the UIImage. None of these attempts made a difference.
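
Roughly, those last two attempts looked like this (sketch only; the asset name "artwork" is a placeholder, and the trait-collection override on the snapshot view is essentially the child-view-controller approach sketched above):

```swift
import UIKit

// Sketch of the workarounds described above; none of them changed the diff.
let srgbTraits = UITraitCollection(displayGamut: .SRGB)

// 1. Ask the asset catalog for an image variant matching sRGB traits explicitly.
let image = UIImage(named: "artwork", in: Bundle.main, compatibleWith: srgbTraits)

// 2. Re-render the image with extended-range (wide gamut) output disabled.
let format = UIGraphicsImageRendererFormat.default()
format.prefersExtendedRange = false
let sdrImage = image.map { img in
    UIGraphicsImageRenderer(size: img.size, format: format).image { _ in
        img.draw(at: .zero)
    }
}
```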

I can try to upload a test project later this afternoon if I don't see another one show up soon.

@Grubas7

I've forked the project and pushed to a branch here: https://github.com/nugenttyler/ios-snapshot-test-case/tree/tnugent/master/hardware_snapshot_bug

I added a test to FBSnapshotTestCaseDemoTests called testViewSnapshotWithImage that snapshots a UIImageView. The test passes when running on the iPhone 6s iOS 10 simulator but fails on the iPhone 7 iOS 10 simulator. I've checked in the reference image (generated using the 6s simulator), but feel free to re-record it to verify. A few things to note (a rough Swift sketch of the test follows the list):

  • If I just snapshot a UIImage asset in a UIImageView the test passes on both devices.
  • If I use UIImageRenderingModeAlwaysTemplate along with a tint color then the snapshots produced by the 6s and 7 are not equivalent. ksdiff suggests that the antialiasing is using slightly different values.
  • Oddly, some color values don't cause problems (e.g., substituting 'redColor' will pass the test).
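
The sketch mentioned above, for anyone who doesn't want to pull the branch (the asset name, tint color, and frame are placeholders; the actual test in the branch lives in FBSnapshotTestCaseDemoTests):

```swift
import FBSnapshotTestCase
import UIKit

class ImageSnapshotTests: FBSnapshotTestCase {
    override func setUp() {
        super.setUp()
        recordMode = false  // set to true to (re)record the reference on the 6s simulator
    }

    func testViewSnapshotWithImage() {
        // Template rendering plus a tint color is what triggers the 6s/7 difference.
        let image = UIImage(named: "icon")?.withRenderingMode(.alwaysTemplate)
        let imageView = UIImageView(image: image)
        imageView.tintColor = UIColor(red: 0.2, green: 0.4, blue: 0.6, alpha: 1.0)
        imageView.frame = CGRect(x: 0, y: 0, width: 100, height: 100)

        FBSnapshotVerifyView(imageView)
    }
}
```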

I also noticed that your testViewSnapshotWithVisualEffects test fails between iOS 9 and 10, but that's less concerning since it's a major release difference; we bucket our snapshots by release, but not by simulator type.
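
For context, that bucketing is just a matter of folding the OS version into the snapshot identifier; something like this sketch (the helper name is made up, `identifier` is a standard FBSnapshotVerifyView parameter):

```swift
import FBSnapshotTestCase
import UIKit

extension FBSnapshotTestCase {
    // Made-up helper: buckets reference images by iOS release by folding the
    // system version string (e.g. "10.0") into the snapshot identifier.
    func verifyPerOSRelease(_ view: UIView) {
        let osVersion = UIDevice.current.systemVersion
        FBSnapshotVerifyView(view, identifier: "iOS_\(osVersion)")
    }
}
```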

Let me know if you need anything else on my end. We rely on snapshot testing heavily and I've been very happy with allowing our devs to run the tests against arbitrary simulators.

Thanks!

@nugenttyler Fantastic 🎉! I'll investigate in the next few days.

@Grubas7 Another data point: I found a case today where I'm not using a UIImageRenderingModeAlwaysTemplate image but it fails in a similar way. This time it fails when I switch between compiling with Xcode 7.3.1 and Xcode 8, both targeting the same iOS 9.3 version. Our CI build is still on 7.3.1, so I had to track down the difference, and it turns out the two Xcode versions produce slightly different images even though they're running the same iOS version. My plan B is to have the failure report the percentage of pixels that differ and use that as a precise tolerance value for the time being. Let me know if you find any workarounds on your end.
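
Roughly what plan B would look like, assuming a library version whose Swift helper exposes a tolerance argument (the 0.004 value and the view factory are placeholders):

```swift
import FBSnapshotTestCase
import UIKit

class ArtworkDetailTolerantTests: FBSnapshotTestCase {
    func testArtworkDetailSnapshot() {
        let view = makeArtworkDetailView()
        // Allow a small fraction of differing pixels to absorb the antialiasing
        // drift between Xcode/simulator combinations. 0.004 is a placeholder.
        FBSnapshotVerifyView(view, tolerance: 0.004)
    }

    // Hypothetical factory for the view under test.
    private func makeArtworkDetailView() -> UIView {
        let view = UIView(frame: CGRect(x: 0, y: 0, width: 375, height: 667))
        view.backgroundColor = .white
        return view
    }
}
```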

Thanks.

-Tyler.

@nugenttyler Tests also fail for reference images recorded on iOS 10.3 and then tested on 10.2 and lower. The same thing happens if you test in the opposite direction. Investigation in progress...