There's an anti-alias effect the canvas applies when an image is enlarged or reduced from its original size.
What I would try is this:
Create the canvas exactly the size you need it.
Set the background image, which should be exactly the same size as the canvas (I guess you want it 32x32, right?), then check if this works.
If you set an image that is even 1 pixel different from the canvas size, that anti-alias effect will appear.
One of my strongest wishes for App Inventor is for the developers to provide a way to turn this property on or off. It is a very big impediment sometimes.
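To show what I mean by that anti-alias effect, here is a quick sketch in plain desktop Java (java.awt, not App Inventor blocks; the 2x2 and 3x3 sizes and the colours are made up just for the demo). As soon as the image is drawn at a size other than its own, a filtering scaler blends neighbouring pixels:

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class ScaleDemo {
    public static void main(String[] args) {
        // 2x2 source: left column pure red, right column pure black.
        BufferedImage src = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        src.setRGB(0, 0, 0xFF0000); src.setRGB(0, 1, 0xFF0000);
        src.setRGB(1, 0, 0x000000); src.setRGB(1, 1, 0x000000);

        // Draw it into a 3x3 target, i.e. just 1 pixel "different" in each direction.
        BufferedImage dst = new BufferedImage(3, 3, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                           RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g.drawImage(src, 0, 0, 3, 3, null);
        g.dispose();

        // The middle column is no longer guaranteed to be pure red or pure
        // black: the scaler has mixed the two into intermediate values.
        for (int y = 0; y < 3; y++)
            for (int x = 0; x < 3; x++)
                System.out.printf("(%d,%d) = #%06X%n", x, y, dst.getRGB(x, y) & 0xFFFFFF);
    }
}
```

That mixing is exactly what you then read back with GetPixelColor, which is why the values look "off" unless the canvas and image sizes match.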
Thanks @Italo - the original image I was trying was 32 pixels x 32 pixels and I set the canvas size to the same. @TIMAI2 I vaguely thought the same thing - enlarge it, and post-process it to get the correct "average colour". I can post two images tomorrow - one that is a pixel-perfect enlargement of the original image, and one that is a complete gradient - and it'd be good to test a workaround.
Thanks everyone who is helping out with this!
Also, I wonder if the 24-bit colour depth is an issue too... even though it would only read RGB 0-255 from the canvas.
I made a small picture, a black and white 15x15 truss pattern. The Canvas is also set to 15x15. My Samsung has 3 screen resolution settings: HD+ (1480x720), FHD+ (2220x1080) and WQHD+ (2960x1440). I tested on all of them. Proper pixel detection only happens at the highest, WQHD+, resolution. I guess that there 1 px on the phone is 1 px on the Canvas. With the phone's resolution reduced, the pixels get squeezed and distortion and blur are created.
This is not a file format problem, because I also tested with BMP files of different colour depths. The problem is the phone's resolution and the pinching and stretching of pixels.
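Just to put rough numbers on that squeezing (the density factors below are only my guesses for those three settings, not measured values), the same 15-unit canvas ends up backed by a different number of physical pixels each time, so a 15x15 source image has to be stretched by a different factor:

```java
public class DensityDemo {
    public static void main(String[] args) {
        int canvasUnits = 15;                      // the 15x15 test canvas
        double[] assumedDensity = {2.0, 3.0, 4.0}; // guesses for HD+, FHD+, WQHD+
        String[] label = {"HD+", "FHD+", "WQHD+"};
        for (int i = 0; i < assumedDensity.length; i++) {
            double backing = canvasUnits * assumedDensity[i];
            System.out.printf("%-5s density %.1f -> %.0f x %.0f backing pixels%n",
                    label[i], assumedDensity[i], backing, backing);
        }
    }
}
```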
Ok. So there are a couple of issues here. First, the documentation is underspecified but the indexing on the Canvas (contrary to App Inventor conventions) starts at (0, 0) and goes to (width - 1, height - 1). Second, the Canvas bitmap is scaled based on the DPI of the device. This has an unfortunate side-effect of changing the behavior of the algorithm based on the screen density of the target device. For example, I ran the exact same code on an emulator representing a mdpi Android device and a xxhdpi device and got different results, both due to the fact that the scaling can result in bilinear interpolation of the pixel data and because on sufficiently high DPIs the pixels are duplicated in their entirety.
A long-term solution would be to fix the Canvas API to be consistent across screen resolutions and to fit the 1-based indexing of App Inventor lists rather than the 0-based indexing aligned with the Android API. However, both of these changes need to be done in a way that won't break existing apps.
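Purely as an illustration of the second point (this is not the actual App Inventor source, just a sketch of the effect), the lookup behaves roughly as if the 0-based logical coordinate were multiplied by the device density before the backing bitmap is read, so the same call can land on an exactly duplicated pixel on one device and on an interpolated one on another:

```java
public class PixelLookupSketch {
    // Map a 0-based logical canvas coordinate to a backing-bitmap coordinate
    // for a given density factor (hypothetical helper, for illustration only).
    static int toBitmapCoord(int logical, float density) {
        return Math.round(logical * density);
    }

    public static void main(String[] args) {
        // At mdpi (density 1.0) logical pixel 5 reads backing pixel 5.
        System.out.println(toBitmapCoord(5, 1.0f)); // 5
        // At xxhdpi (density 3.0) the same call reads backing pixel 15, whose
        // value depends on how the source image was scaled up: whole-pixel
        // duplication keeps it exact, bilinear filtering blends neighbours.
        System.out.println(toBitmapCoord(5, 3.0f)); // 15
    }
}
```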
Do you only need to use that 32x32 image you showed in your first post, or will your app use any image in the canvas? If it's only that palette gradient, I MAY have a solution for that.
Exactly. In my opinion, correct reading of pixel colors at all resolutions is not possible now. The application user would have to experiment with the resolution.
And it would be very cool to control RGB LED matrices by displaying Canvas images on them.
Hello all, my apologies for the extremely late reply - I had been working around hospitals (IT) and it's taken all of my time! I don't just need 32x32 images, and I now have a newer Android phone where it works much better (almost a pixel-for-pixel relation). As such, I have taken the project a little further (though Bluetooth is my issue now!).
I have got a working Canvas-to-RGB-LED example, which I will post images of next. Right now it's operating at a 64x64 resolution.
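One simple way to pack a 64x64 frame for a matrix like that (just a sketch in plain Java; the pack() helper and the 3-bytes-per-pixel row-major layout are illustrative, not an App Inventor or LED-library API) is to send R, G, B as one byte each, 64*64*3 = 12288 bytes per frame, over the Bluetooth link:

```java
public class MatrixPacker {
    // Pack ARGB/RGB ints (row-major) into an RGB888 byte stream.
    public static byte[] pack(int width, int height, int[][] pixels) {
        byte[] out = new byte[width * height * 3];
        int i = 0;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int c = pixels[y][x];
                out[i++] = (byte) ((c >> 16) & 0xFF); // R
                out[i++] = (byte) ((c >> 8) & 0xFF);  // G
                out[i++] = (byte) (c & 0xFF);         // B
            }
        }
        return out;
    }

    public static void main(String[] args) {
        int[][] demo = { {0xFF0000, 0x00FF00}, {0x0000FF, 0xFFFFFF} }; // tiny 2x2 test frame
        System.out.println(pack(2, 2, demo).length + " bytes");        // prints "12 bytes"
    }
}
```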
Some examples for you all to see (and maybe be inspired by).
I will get some more now that I have a bit more time, and will work on this further. I think I might need to sample the image down, considering I will be able to use images larger than 64 pixels (I'm currently also taking an image with the camera to reproduce on the LED strip).
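One way to do that sampling would be simple block averaging: collapse each block of source pixels to one output pixel so an arbitrary-sized camera picture fits the 64x64 matrix. A rough sketch in plain Java (the averageDown name and the gradient self-test are just for illustration):

```java
import java.awt.image.BufferedImage;

public class Downsampler {
    // Average each source block down to one output pixel.
    public static BufferedImage averageDown(BufferedImage src, int outW, int outH) {
        BufferedImage out = new BufferedImage(outW, outH, BufferedImage.TYPE_INT_RGB);
        for (int oy = 0; oy < outH; oy++) {
            for (int ox = 0; ox < outW; ox++) {
                // Source block covered by this output pixel.
                int x0 = ox * src.getWidth() / outW, x1 = Math.max((ox + 1) * src.getWidth() / outW, x0 + 1);
                int y0 = oy * src.getHeight() / outH, y1 = Math.max((oy + 1) * src.getHeight() / outH, y0 + 1);
                long r = 0, g = 0, b = 0, n = 0;
                for (int y = y0; y < y1; y++) {
                    for (int x = x0; x < x1; x++) {
                        int c = src.getRGB(x, y);
                        r += (c >> 16) & 0xFF; g += (c >> 8) & 0xFF; b += c & 0xFF; n++;
                    }
                }
                out.setRGB(ox, oy, (int) ((r / n) << 16 | (g / n) << 8 | (b / n)));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Tiny self-test: a 128x128 horizontal red ramp averaged down to 64x64.
        BufferedImage src = new BufferedImage(128, 128, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 128; y++)
            for (int x = 0; x < 128; x++)
                src.setRGB(x, y, (x * 2) << 16);
        BufferedImage out = averageDown(src, 64, 64);
        System.out.printf("out(0,0)=#%06X out(63,0)=#%06X%n",
                out.getRGB(0, 0) & 0xFFFFFF, out.getRGB(63, 0) & 0xFFFFFF);
    }
}
```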
Also, think about it for a while: do you think only the first pixel should be 255? I think the pixel below it should also be 255, as the picture looks like 3-4 light pixels followed by 3-4 dark pixels.
So your expectation is just wrong.
If you don't mind, try putting the returned values back into an image; if you want proof, both will look the same.
In the original image I used Photoshop to make a colour gradient and then added darker bands on alternating pixels. When I try to extract the RGB values using the Canvas, the result varies with the phone's resolution. Pixel-perfect, the top-left red value would be 255 and the next red value underneath should be 195. This was a test image with banding.
Nope, I don't think so: the top-left colour (red) and the colour below it are the same for me; only after 3-4 pixels does it become dark red. So the result should be right; only the approach is wrong.
Is it the same when extracting it using GetBackgroundPixelColor? If so, I think you are seeing the variance of Android displays. The original image is, per pixel, as below:
Juan is right: to avoid errors, you need to have a region of the same colour, not just a single pixel. The centre pixel of the region should hold the correct colour value. You also need to use something like Paint to produce the image, to avoid the anti-aliasing, dithering and super-sampling that Photoshop etc. may apply automatically.
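To make that concrete, here is a small plain-Java sketch of reading the centre of each colour region instead of its corner (the centreColour helper and the cell sizes are only illustrative):

```java
import java.awt.image.BufferedImage;

public class RegionSampler {
    // Colour at the centre of cell (cellX, cellY) in a grid of cellSize x cellSize regions.
    public static int centreColour(BufferedImage img, int cellX, int cellY, int cellSize) {
        int cx = cellX * cellSize + cellSize / 2;
        int cy = cellY * cellSize + cellSize / 2;
        return img.getRGB(cx, cy) & 0xFFFFFF; // drop alpha, keep RRGGBB
    }

    public static void main(String[] args) {
        // Tiny self-test: a 4x4 image split into 2x2 cells of solid colour.
        BufferedImage img = new BufferedImage(4, 4, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 4; y++)
            for (int x = 0; x < 4; x++)
                img.setRGB(x, y, (x < 2) ? 0xFF0000 : 0x0000FF);
        System.out.printf("#%06X #%06X%n",
                centreColour(img, 0, 0, 2), centreColour(img, 1, 0, 2)); // #FF0000 #0000FF
    }
}
```

Even if a boundary pixel gets blended by the display scaling, the centre of the region still reads the intended colour.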