Get an image from a CALayer or NSView (Swift 3)
I'm looking for a way to render a CALayer or an NSView and get an NSImage back.

I have a custom class that subclasses NSView. To start with, I just added a gradient layer covering the NSView:
class ContentView: NSView {
    // Assumed properties, implied but not shown in the original snippet:
    var gradientLayer = CAGradientLayer()
    let margin: CGFloat = 0

    override func draw(_ dirtyRect: NSRect) {
        fillGradientLayer()
    }

    func fillGradientLayer() {
        gradientLayer = CAGradientLayer()
        gradientLayer.colors = [#colorLiteral(red: 0, green: 0.9909763549, blue: 0.7570167824, alpha: 1), #colorLiteral(red: 0, green: 0.4772562545, blue: 1, alpha: 1)].map({ return $0.cgColor })
        gradientLayer.startPoint = CGPoint(x: 0, y: 0.5)
        gradientLayer.endPoint = CGPoint(x: 1, y: 0.5)
        gradientLayer.frame = self.frame.insetBy(dx: margin, dy: margin)
        gradientLayer.zPosition = 2
        gradientLayer.name = "gradientLayer"
        gradientLayer.contentsScale = (NSScreen.main()?.backingScaleFactor)!
        self.layer?.addSublayer(gradientLayer)
    }
}
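As an aside (a point also raised in the comments at the end of this thread), calling fillGradientLayer() from draw(_:) adds a fresh sublayer on every render pass. A minimal sketch of configuring the layer once instead, assuming a layer-backed view (the GradientView class name and colors here are illustrative, not from the original):

```swift
import AppKit

// Sketch: add the gradient sublayer once, then only resize it in
// layout(), instead of re-adding it on every call to draw(_:).
class GradientView: NSView {
    private let gradientLayer = CAGradientLayer()

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        wantsLayer = true
        gradientLayer.colors = [NSColor.green.cgColor, NSColor.blue.cgColor]
        gradientLayer.startPoint = CGPoint(x: 0, y: 0.5)
        gradientLayer.endPoint = CGPoint(x: 1, y: 0.5)
        layer?.addSublayer(gradientLayer)
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        // (layer setup for the storyboard/nib path omitted in this sketch)
    }

    override func layout() {
        super.layout()
        // Keep the sublayer in sync with the view's size.
        gradientLayer.frame = bounds
    }
}
```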
At some point, I'd like to get an NSImage (or CGImage) from those CALayers.
I found some samples on the internet and turned them into this:
extension CALayer {
    func getBitmapImage() -> NSImage {
        let btmpImgRep = NSBitmapImageRep(bitmapDataPlanes: nil, pixelsWide: Int(self.frame.width), pixelsHigh: Int(self.frame.height), bitsPerSample: 8, samplesPerPixel: 4, hasAlpha: true, isPlanar: false, colorSpaceName: NSDeviceRGBColorSpace, bytesPerRow: 0, bitsPerPixel: 0)

        let image: NSImage = NSImage(size: self.frame.size)
        image.lockFocus()

        let ctx = NSGraphicsContext(bitmapImageRep: btmpImgRep!)
        let cgCtxt = ctx!.cgContext
        self.render(in: cgCtxt)
        cgCtxt.draw(layer: CGLayer, in: CGRect) // <- this line doesn't compile

        image.unlockFocus()
        return image
    }
}
The first problem is that the draw(layer:in:) method expects a CGLayer, not a CALayer.
Second, I don't know whether the code I ended up with will actually render my layer, or whether I'm doing something wrong.
Thanks for your help.
Edit: solution for getting a CGImage from a CALayer
extension CALayer {
    func getBitmapImage() -> NSImage {
        let btmpImgRep = NSBitmapImageRep(bitmapDataPlanes: nil, pixelsWide: Int(self.frame.width), pixelsHigh: Int(self.frame.height), bitsPerSample: 8, samplesPerPixel: 4, hasAlpha: true, isPlanar: false, colorSpaceName: NSDeviceRGBColorSpace, bytesPerRow: 0, bitsPerPixel: 32)

        let ctx = NSGraphicsContext(bitmapImageRep: btmpImgRep!)
        let cgContext = ctx!.cgContext
        self.render(in: cgContext)

        let cgImage = cgContext.makeImage()
        let nsimage = NSImage(cgImage: cgImage!, size: CGSize(width: self.frame.width, height: self.frame.height))
        return nsimage
    }
}
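For completeness, a hedged usage sketch of the extension above. It assumes the getBitmapImage() extension from the snippet is in scope; the layer, colors, and output path are just examples, and the .PNG file-type spelling is the Swift 3 one (.png in later Swift versions):

```swift
import AppKit

// Example layer to render (any CALayer with a nonzero frame works).
let layer = CAGradientLayer()
layer.frame = CGRect(x: 0, y: 0, width: 200, height: 100)
layer.colors = [NSColor.green.cgColor, NSColor.blue.cgColor]

// Render it with the getBitmapImage() extension defined above.
let image = layer.getBitmapImage()

// Optionally encode the result to PNG via a bitmap representation.
if let tiff = image.tiffRepresentation,
   let rep = NSBitmapImageRep(data: tiff),
   let png = rep.representation(using: .PNG, properties: [:]) {
    try? png.write(to: URL(fileURLWithPath: "gradient.png"))
}
```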
For how to get an NSImage from an NSView, see the answer below.
Also, I don't know the NSBitmapImageRep properties very well, so that part may be wrong.

Here are some NSView and CALayer options:
extension NSView {
    /// Get `NSImage` representation of the view.
    ///
    /// - Returns: `NSImage` of view
    func image() -> NSImage {
        let imageRepresentation = bitmapImageRepForCachingDisplay(in: bounds)!
        cacheDisplay(in: bounds, to: imageRepresentation)
        return NSImage(cgImage: imageRepresentation.cgImage!, size: bounds.size)
    }
}

extension CALayer {
    /// Get `NSImage` representation of the layer.
    ///
    /// - Returns: `NSImage` of the layer.
    func image() -> NSImage {
        let width = Int(bounds.width * self.contentsScale)
        let height = Int(bounds.height * self.contentsScale)
        let imageRepresentation = NSBitmapImageRep(bitmapDataPlanes: nil, pixelsWide: width, pixelsHigh: height, bitsPerSample: 8, samplesPerPixel: 4, hasAlpha: true, isPlanar: false, colorSpaceName: NSDeviceRGBColorSpace, bytesPerRow: 0, bitsPerPixel: 0)!
        imageRepresentation.size = bounds.size

        let context = NSGraphicsContext(bitmapImageRep: imageRepresentation)!
        render(in: context.cgContext)

        return NSImage(cgImage: imageRepresentation.cgImage!, size: bounds.size)
    }
}
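A quick sketch of how the NSView variant might be exercised. The view here is hypothetical, and it assumes the image() extension above is in scope; cacheDisplay(in:to:) also works for views that are not yet in a window:

```swift
import AppKit

// A simple layer-backed view to snapshot.
let view = NSView(frame: NSRect(x: 0, y: 0, width: 120, height: 80))
view.wantsLayer = true
view.layer?.backgroundColor = NSColor.red.cgColor

// Snapshot using the image() extension above.
let snapshot = view.image()
print(snapshot.size)  // matches view.bounds.size
```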
Maybe try going up a level to UIView, since every view has a CALayer. The answer at this link shows how to generate a PNG from a UIView: . I hope it also works for NSView.

I've seen how to do it with a UIView in UIKit, but I don't know how to do it on the Mac. Sorry, I wanted to help anyway!

Unrelated, but I would not recommend calling addSublayer from draw. Every time the view is rendered, this sublayer will be added again. At best this is inefficient; logically, it is simply the wrong place to do it. Use draw if you are drawing something yourself (e.g., with Core Graphics or the like). If you want to add a CAShapeLayer as a sublayer, do that somewhere else (e.g., from the layout method or similar).

Is there any way to render a CALayer, and not just an NSView?

I accepted the answer, and added the method for rendering a CALayer into an NSImage to my question.

@Alex - see the revised answer above for some CALayer alternatives. Note that I'm following some of those approaches in order to capture screen resolution (setting up the bitmap context multiplied by the scale factor, but resetting the "user size" so it captures the DPI).

Thanks @Rob for this great solution! This line is quite inefficient: return NSImage(cgImage: imageRepresentation.cgImage!, size: bounds.size), because it creates a cgImage object that isn't needed. Better: let result = NSImage(size: bounds.size); result.addRepresentation(imageRepresentation); return result. An NSImage is just a container holding metadata and representations; the image data itself lives only in the representations.
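The last comment's suggestion can be sketched as a variant of the NSView extension that skips the intermediate CGImage entirely (the snapshotImage() name is illustrative):

```swift
import AppKit

extension NSView {
    /// Snapshot without creating an intermediate CGImage:
    /// wrap the bitmap representation in the NSImage directly.
    func snapshotImage() -> NSImage? {
        guard let rep = bitmapImageRepForCachingDisplay(in: bounds) else { return nil }
        cacheDisplay(in: bounds, to: rep)
        let result = NSImage(size: bounds.size)
        result.addRepresentation(rep)
        return result
    }
}
```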