How to snapshot a view on macOS with provided screen scale? #428

Description

@darrarski

When snapshot testing views on macOS, the recorded image size depends on the main display of the machine on which the tests are run.

Example:

  • I want to snapshot an NSView of size 640x480 points
  • When running a test on a MacBook with a Retina display, the snapshot image has a size of 1280x960 pixels
  • When running the same test on the same MacBook, but with the lid closed and an external, non-Retina screen connected, the snapshot image has a size of 640x480 pixels (a quick way to check the scale factor involved is sketched right after this list)
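For reference, the pixel size follows the backing scale factor of the display the view is rendered for. A minimal check (just an illustrative sketch; it assumes the main screen is what determines the snapshot scale):

import AppKit

// 2.0 on a Retina main display, 1.0 on a typical non-Retina external monitor.
let scale = NSScreen.main?.backingScaleFactor ?? 1.0
print("main screen backing scale factor: \(scale)")
// For a 640x480-point view this corresponds to a 1280x960- or 640x480-pixel snapshot.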

When snapshot testing views on iOS, I can provide a display-scale trait collection to force the output image scale (as described in #427). Unfortunately, I haven't found a way to snapshot test macOS views so that the test results do not depend on the display I am using (the MacBook's built-in screen or an external monitor).
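On iOS that looks roughly like this (a sketch only; the view controller is a placeholder, and the exact assertSnapshot parameters may differ between SnapshotTesting versions):

import SnapshotTesting
import UIKit
import XCTest

final class ExampleTests: XCTestCase {
  func testViewControllerAt2x() {
    let viewController = UIViewController() // placeholder for the view controller under test
    // The displayScale trait forces a 2x image regardless of the simulator's screen.
    assertSnapshot(
      matching: viewController,
      as: .image(traits: UITraitCollection(displayScale: 2))
    )
  }
}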

I tried to work around this issue with the code below:

import AppKit
import SnapshotTesting

extension Snapshotting where Value == NSViewController, Format == NSImage {
  /// Snapshots the view controller's view at 1x, independent of the main display's scale.
  static func unscaledImage(precision: Float = 1) -> Snapshotting {
    Snapshotting<NSView, NSImage>
      .unscaledImage(precision: precision)
      .pullback { $0.view }
  }
}

extension Snapshotting where Value == NSView, Format == NSImage {
  /// Renders the view into a bitmap and re-rasterizes it at its point size.
  static func unscaledImage(precision: Float = 1) -> Snapshotting {
    SimplySnapshotting<NSImage>
      .image(precision: precision)
      .pullback { $0.toImage().unscaled() }
  }
}

private extension NSView {
  /// Caches the view's current display into an NSImage, at the screen's backing scale.
  func toImage() -> NSImage {
    let cacheRep = bitmapImageRepForCachingDisplay(in: bounds)!
    cacheDisplay(in: bounds, to: cacheRep)
    let image = NSImage(size: bounds.size)
    image.addRepresentation(cacheRep)
    return image
  }
}

private extension NSImage {
  /// Returns a copy of the image backed by a bitmap whose pixel size equals its point size.
  func unscaled() -> NSImage {
    let image = NSImage(size: size)
    image.addRepresentation(unscaledBitmapImageRep())
    return image
  }

  /// Redraws the image into a fresh 1x (point-sized) bitmap representation.
  func unscaledBitmapImageRep() -> NSBitmapImageRep {
    let imageRep = NSBitmapImageRep(
      bitmapDataPlanes: nil,
      pixelsWide: Int(size.width),
      pixelsHigh: Int(size.height),
      bitsPerSample: 8,
      samplesPerPixel: 4,
      hasAlpha: true,
      isPlanar: false,
      colorSpaceName: .deviceRGB,
      bytesPerRow: 0,
      bitsPerPixel: 0
    )!
    imageRep.size = size
    NSGraphicsContext.saveGraphicsState()
    NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: imageRep)
    draw(at: .zero, from: .zero, operation: .sourceOver, fraction: 1)
    NSGraphicsContext.restoreGraphicsState()
    return imageRep
  }
}
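For completeness, this is how I invoke the strategy in a test (the view here is just a placeholder):

import AppKit
import SnapshotTesting
import XCTest

final class UnscaledImageTests: XCTestCase {
  func testViewAtPointSize() {
    let view = NSView(frame: NSRect(x: 0, y: 0, width: 640, height: 480))
    // With the extensions above, the recorded image should always be 640x480 pixels.
    assertSnapshot(matching: view, as: .unscaledImage())
  }
}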

And while this fixes the mismatch in output image size (with the code above it is always equal to the point size of the view), it does not work as expected. It looks like rescaling the image introduces minor distortion, which causes test failures (for example, text on the snapshot image is slightly shifted by 1 px).

Is there a way of forcing the scale of a snapshot image on macOS?
