Trouble with image input for an iOS Swift TensorFlow Lite image-classification model?


I've been trying to add a plant-identification classifier to my app using an ML model hosted on Firebase, and I'm close - the problem is, I'm fairly sure I'm mangling the image data somewhere on input. The classifier is producing largely nonsensical probabilities/results, while the same classifier tested through a Python script gives accurate results.

The model's input expects a 224x224 image with 3 channels scaled to [0, 1]. I believe I've done all of that, but I can't seem to get the CGImage from the camera/image picker handled correctly. Here's the bit of code that processes the image input:

if let imageData = info[.originalImage] as? UIImage {
            DispatchQueue.main.async {

                // Avoid force-unwrapping: any step of the conversion can fail.
                guard let resizedImage = imageData.scaledImage(with: CGSize(width: 224, height: 224)),
                      let ciImage = CIImage(image: resizedImage) else { return }

                let ciContext = CIContext(options: nil)
                guard let image = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }

                guard let context = CGContext(
                    data: nil,
                    width: image.width, height: image.height,
                    bitsPerComponent: 8, bytesPerRow: image.width * 4,
                    space: CGColorSpaceCreateDeviceRGB(),
                    bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue
                ) else {
                    return
                }

                context.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
                guard let pixelData = context.data else { return }

                print("Pixel buffer is at: \(pixelData)")
                var inputData = Data()
                for row in 0 ..< 224 {
                    for col in 0 ..< 224 {
                        let offset = 4 * (row * context.width + col)
                        // Offset 0 is the unused alpha channel (noneSkipFirst).
                        let red = pixelData.load(fromByteOffset: offset + 1, as: UInt8.self)
                        let green = pixelData.load(fromByteOffset: offset + 2, as: UInt8.self)
                        let blue = pixelData.load(fromByteOffset: offset + 3, as: UInt8.self)

                        // Normalize channel values to [0.0, 1.0] and append
                        // them to the Data object in RGB order.
                        for channel in [red, green, blue] {
                            let normalized = Float32(channel) / 255.0
                            withUnsafeBytes(of: normalized) { inputData.append(contentsOf: $0) }
                        }
                    }
                }
                print("Successfully built inputData (\(inputData.count) bytes)")
                self.parent.invokeInterpreter(inputData: inputData)
            }
        }
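Since the model expects 224x224x3 Float32 values, the input buffer has a fixed, predictable size. A quick sanity check on `inputData.count` before invoking the interpreter rules out the buffer-building step as the culprit (a minimal sketch; the numbers assume the input shape described above):

```swift
// Expected input size for a width x height x channels Float32 tensor:
// every pixel contributes `channels` floats of 4 bytes each.
func expectedInputByteCount(width: Int, height: Int, channels: Int) -> Int {
    return width * height * channels * MemoryLayout<Float32>.size
}

// For this model: 224 * 224 * 3 * 4 = 602,112 bytes.
// If inputData.count differs, the buffer was built incorrectly
// before it ever reaches the interpreter.
let expected = expectedInputByteCount(width: 224, height: 224, channels: 3)
print(expected)  // 602112
```

Comparing this against `inputData.count` right before `invokeInterpreter` separates "wrong buffer layout" from "wrong pixel values".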

What does this question have to do with Firebase? Firebase has a lot of different products - are you having a problem with Firebase itself, or with how the CGImage data is handled?

Firebase hosts the image classifier itself - I added that tag in case something is going wrong when the classifier is invoked while hosted on Firebase.
func invokeInterpreter(inputData: Data) {
    do {
        let interpreter = try Interpreter(modelPath: ProfileUserData.sharedUserData.modelPath)
        var labels: [String] = []

        try interpreter.allocateTensors()
        try interpreter.copy(inputData, toInputAt: 0)
        try interpreter.invoke()

        let output = try interpreter.output(at: 0)

        switch output.dataType {
        case .uInt8:
            guard let quantization = output.quantizationParameters else {
                print("No results returned because the quantization values for the output tensor are nil.")
                return
            }
            let quantizedResults = [UInt8](output.data)
            let results = quantizedResults.map {
                quantization.scale * Float(Int($0) - quantization.zeroPoint)
            }

            let sum = results.reduce(0, +)
            print("Sum of all dequantized results is: \(sum)")
            print("Count of dequantized results is: \(results.count)")

            let filename = "plantLabels"
            let fileExtension = "csv"
            guard let labelPath = Bundle.main.url(forResource: filename, withExtension: fileExtension) else {
                print("Labels file not found in bundle. Please check labels file.")
                return
            }

            do {
                let contents = try String(contentsOf: labelPath, encoding: .utf8)
                labels = contents.components(separatedBy: .newlines)
                print("Count of label rows is: \(labels.count)")
            } catch {
                fatalError("Labels file named \(filename).\(fileExtension) cannot be read. Please add a " +
                           "valid labels file and try again.")
            }

            // Pair each label index with its confidence value,
            // then sort by confidence descending and keep the top 3.
            let zippedResults = zip(labels.indices, results)
            let sortedResults = zippedResults.sorted { $0.1 > $1.1 }.prefix(3)

            print("Printing sortedResults: \(sortedResults)")
        case .float32:
            print("Output tensor data type [Float32] is unsupported for this model.")
        default:
            print("Output tensor data type \(output.dataType) is unsupported for this model.")
            return
        }
    } catch {
        // Error while running the interpreter.
        print("Error with running interpreter: \(error.localizedDescription)")
    }
}