JavaScript: How to get the correct pixel color on an image [React Native]
I'm developing an app that captures/loads an image and then picks the correct color from the touch coordinates on that image/camera view (similar to the Swatches app on iOS).
The libraries I'm using are react-native-camera, react-native-get-pixel-color and react-native-draggable.
There's a logic problem I'm stuck on right now... I've researched online but I'm still lost in the jungle.
The aspect ratio is not the same on every device, so I can't hardcode it. The docs say the ratio prop only works on Android, not on iOS (and if it isn't defined, Android defaults to 4:3). So I'm wondering: is it better not to set it at all? Is it better to rely on the picture size? Or neither? Any suggestions are welcome.
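Instead of hardcoding a ratio, one option (a sketch, assuming the list of supported ratios comes from RNCamera's Android-only `getSupportedRatiosAsync()`; the helper name is mine) is to pick the supported ratio closest to the screen's aspect:

```javascript
// Hypothetical helper: pick the supported camera ratio whose aspect is
// closest to the screen's aspect. `supported` is an array of "W:H" strings
// as returned by react-native-camera's getSupportedRatiosAsync() on Android.
function closestRatio(supported, screenWidth, screenHeight) {
  const target =
    Math.max(screenWidth, screenHeight) / Math.min(screenWidth, screenHeight);
  let best = supported[0];
  let bestDiff = Infinity;
  for (const r of supported) {
    const [w, h] = r.split(':').map(Number);
    const diff = Math.abs(w / h - target);
    if (diff < bestDiff) {
      bestDiff = diff;
      best = r;
    }
  }
  return best;
}

// Example: a 1080x2340 screen (aspect ~2.17) is closest to 16:9 among these.
console.log(closestRatio(['4:3', '16:9', '1:1'], 1080, 2340)); // "16:9"
```

On iOS, where the ratio prop is ignored, the math in the drag handler below still has to account for whatever ratio the camera produces.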
What I'm really confused about now is the calculation part: I want to pick the correct pixel of the image's color despite the differing ratio, device screen size and image size:
- Live view - always pick from the center
- Still view - based on the Draggable and its coordinates
1. renderCamView() - renders either the Camera or the Image
2. renderCenterView() - renders the center item: a draggable in 'still' view, or a fixed touchable button in 'live' view that always picks the center point
3. onDragEvt() - the Draggable's event, used to track the draggable item's movement
4. this.state.mainView - state holding the width and height of this screen's root <View />
5. captureColor() - onPress handler that gets the pixel color
6. this.state.isEnabled - defines whether it is still view or live view
*Sorry for my poor English; please let me know if anything is confusing.
renderCenterView = () => {
  if (this.state.isEnabled) {
    const { mainView } = this.state;
    return (
      <Draggable
        x={mainView.width / 2 - 20}
        y={mainView.height / 2 - 20}
        isCircle
        onShortPressRelease={this.captureColor}
        onDragRelease={this.onDragEvt}
        minX={0}
        minY={0}
        maxX={mainView.width}
        maxY={mainView.height}
      >
        <View
          style={{
            width: 50,
            height: 50,
            borderWidth: 15,
            borderRadius: 50,
            borderColor: '#d1c1b6',
            opacity: 0.8,
          }}
        />
      </Draggable>
    );
  } else {
    return (
      <TouchableOpacity
        onPress={this.captureColor}
        style={[
          styles.middlePoint,
          {
            top: this.state.mainView.height / 2 - 20,
            left: this.state.mainView.width / 2 - 20,
          },
        ]}
      />
    );
  }
};
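One detail worth pulling out: to truly center an item, the offset should be half its rendered size (the snippet above subtracts 20 for a 50px circle, leaving it 5px off-center). A tiny helper (the name is mine) makes this explicit and testable:

```javascript
// Hypothetical helper: top-left position that centers a square item of a
// given size inside a container of containerW x containerH.
function centeredPosition(containerW, containerH, itemSize) {
  return {
    x: containerW / 2 - itemSize / 2,
    y: containerH / 2 - itemSize / 2,
  };
}

// A 50px circle centered in a 400x800 view sits at (175, 375).
console.log(centeredPosition(400, 800, 50)); // { x: 175, y: 375 }
```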
// Logic on Draggable, to find the coordinates on the image view
onDragEvt = (event, gestureState, bounds) => {
  let sourceWidth = this.state.liveImg.width;
  let sourceHeight = this.state.liveImg.height;
  // Find view dimensions
  let width = this.state.mainView.width;
  let height = this.state.mainView.height;
  // Assuming the image has been scaled to the smaller dimension to fit
  // ("contain"), calculate the scale
  let wScale = width / sourceWidth;
  let hScale = height / sourceHeight;
  let scale = Math.min(wScale, hScale);
  // Find the offset along the axis the image doesn't fill the screen
  // For ImageViews that wrap the content, both offsets will be 0, so this is redundant
  let offsetX = 0;
  let offsetY = 0;
  if (wScale <= hScale) {
    offsetY = height / 2 - (scale * sourceHeight) / 2;
  } else {
    offsetX = width / 2 - (scale * sourceWidth) / 2;
  }
  // Convert event coordinates to image coordinates
  let sourceX = (gestureState.moveX - offsetX) / scale;
  let sourceY = (gestureState.moveY - offsetY) / scale;
  if (sourceX < 0) {
    sourceX = 0;
  } else if (sourceX > sourceWidth) {
    sourceX = sourceWidth - 5;
  }
  if (sourceY < 0) {
    sourceY = 0;
  } else if (sourceY > sourceHeight) {
    sourceY = sourceHeight - 5;
  }
  this.setState({ dragView: { x: sourceX, y: sourceY } });
};
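The coordinate math in onDragEvt can be extracted into a pure function, which makes it easy to unit-test against known view and image sizes without a device (the function name and clamping to the last valid pixel are my choices):

```javascript
// Map a touch point in the view to pixel coordinates in the source image,
// assuming the image is centered/letterboxed with "contain" scaling.
function viewToImageCoords(viewW, viewH, imgW, imgH, touchX, touchY) {
  const scale = Math.min(viewW / imgW, viewH / imgH);
  // Offset along the axis the scaled image does not fill
  const offsetX = (viewW - scale * imgW) / 2;
  const offsetY = (viewH - scale * imgH) / 2;
  // Undo offset and scale, then clamp into the image bounds
  const clamp = (v, max) => Math.min(Math.max(v, 0), max);
  return {
    x: clamp((touchX - offsetX) / scale, imgW - 1),
    y: clamp((touchY - offsetY) / scale, imgH - 1),
  };
}

// A 400x800 view showing a 1000x1000 image: "contain" gives scale 0.4 and a
// vertical offset of 200, so the view center maps to the image center.
console.log(viewToImageCoords(400, 800, 1000, 1000, 200, 400)); // { x: 500, y: 500 }
```

Testing this with the center point, corners and out-of-bounds touches is a quick way to confirm the ratio/offset logic before wiring it to gestureState.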
getPixelByPercentage(percentage, widthOrHeight) {
  return (percentage / 100) * widthOrHeight;
}

getPixel(imageData, x, y) {
  // setImage returns a promise, so wait for it before picking the color;
  // calling pickColorAt immediately risks reading from the previous image
  GetPixelColor.setImage(isIOS ? imageData.uri : imageData.base64)
    .then(() => GetPixelColor.pickColorAt(x, y))
    .then((color) => {
      // from react-native-get-pixel-color: 'color' is returned as HEX
    })
    .catch((err) => {
      // Handle errors
      console.error(err);
    });
}
captureColor = async () => {
  if (this.state.isEnabled) {
    // still view: use the draggable's coordinates
    const { dragView, liveImg } = this.state;
    if (Object.keys(dragView).length !== 0) {
      this.getPixel(liveImg.data, dragView.x, dragView.y);
    } else {
      let getPiX = this.getPixelByPercentage(50, liveImg.width);
      let getPiY = this.getPixelByPercentage(50, liveImg.height);
      this.getPixel(liveImg.data, getPiX, getPiY);
    }
  } else {
    if (this.camera) {
      const data = await this.camera.takePictureAsync(CAM_OPTIONS);
      // Image.getSize is callback-based and returns undefined, so the
      // follow-up work has to happen in the success callback (and in the
      // setState callback, so state is guaranteed to be updated)
      Image.getSize(data.uri, (width, height) => {
        this.setState(
          {
            capImage: {
              data: data,
              imageBase64: data.base64,
              width: width,
              height: height,
            },
          },
          () => {
            const { capImage } = this.state;
            let getPiX = this.getPixelByPercentage(50, capImage.width);
            let getPiY = this.getPixelByPercentage(50, capImage.height);
            this.getPixel(capImage.data, getPiX, getPiY);
          },
        );
      });
    }
  }
};
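Since Image.getSize is callback-based, it can also be wrapped in a promise to keep captureColor's async/await style. A generic sketch (getSizeAsync is a hypothetical name; the callback API is injected so the wrapper can be exercised without React Native):

```javascript
// Generic promise wrapper for a (uri, success, failure) callback API such as
// React Native's Image.getSize.
function getSizeAsync(getSizeFn, uri) {
  return new Promise((resolve, reject) => {
    getSizeFn(uri, (width, height) => resolve({ width, height }), reject);
  });
}

// Usage with a fake callback API standing in for Image.getSize:
const fakeGetSize = (uri, onSuccess) => onSuccess(1920, 1080);
getSizeAsync(fakeGetSize, 'file://photo.jpg').then(({ width, height }) => {
  console.log(width, height); // 1920 1080
});
```

In the app this would be called as `await getSizeAsync(Image.getSize, data.uri)` instead of nesting the rest of the logic inside the callback.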