C# predictive coding for image compression


I've been playing with Huffman compression on images to reduce their size while keeping them lossless, and I've read that predictive coding can be used to compress the image data further by reducing its entropy.

From what I understand, in the lossless JPEG standard, each pixel is predicted as a weighted average of the 4 adjacent pixels already encountered in raster order (three above, one to the left). E.g., trying to predict the value of pixel ? based on the preceding pixels x (three above, one to the left):

x x x
x ?

Then I calculate and encode the residual (the difference between the predicted and the actual value).

But if the sum of the 4 neighbour pixels isn't a multiple of 4, the average would have a fraction, right? Should that fraction just be ignored? If so, is the proper encoding of an 8-bit image (stored in a byte[]) something like:
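For what it's worth, integer division in C# truncates toward zero, so the fraction is dropped automatically. A small sketch of the arithmetic (the neighbour values here are made up for illustration):

```csharp
int a = 100, b = 102, c = 101, d = 99; // neighbour pixel values
int sum = a + b + c + d;               // 402
int ave = sum / 4;                     // 100 — integer division truncates 100.5
int pixel = 101;                       // actual pixel value
int residual = ave - pixel;            // -1, a small value near zero
```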

```csharp
public static void encode(byte[] buffer, int width, int height)
{
    // Predict from the original values, so work on a copy.
    var tempBuff = new byte[buffer.Length];
    Array.Copy(buffer, tempBuff, buffer.Length);

    // The first row and the first/last columns are left unpredicted (stored raw).
    for (int i = 1; i < height; i++)
    {
        for (int j = 1; j < width - 1; j++)
        {
            int offsetUp = ((i - 1) * width) + (j - 1);
            int offset = (i * width) + (j - 1);

            int a = tempBuff[offsetUp];       // above-left
            int b = tempBuff[offsetUp + 1];   // above
            int c = tempBuff[offsetUp + 2];   // above-right
            int d = tempBuff[offset];         // left
            int pixel = tempBuff[offset + 1]; // current pixel

            var ave = (a + b + c + d) / 4;    // integer division truncates
            var val = (byte)(ave - pixel);    // residual, wraps mod 256
            buffer[offset + 1] = val;
        }
    }
}

public static void decode(byte[] buffer, int width, int height)
{
    // Same raster order: the neighbours read here are already decoded back
    // to original values, and pixel = ave - residual mirrors residual = ave - pixel.
    for (int i = 1; i < height; i++)
    {
        for (int j = 1; j < width - 1; j++)
        {
            int offsetUp = ((i - 1) * width) + (j - 1);
            int offset = (i * width) + (j - 1);

            int a = buffer[offsetUp];
            int b = buffer[offsetUp + 1];
            int c = buffer[offsetUp + 2];
            int d = buffer[offset];
            int val = buffer[offset + 1];     // stored residual

            var ave = (a + b + c + d) / 4;
            buffer[offset + 1] = (byte)(ave - val); // recover the original pixel
        }
    }
}
```

I just don't see how this reduces the entropy. How can I compress images further while still being lossless?
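One way to see the effect is to compare the Shannon entropy of the raw bytes against the entropy of the residuals. This is a hypothetical helper, not part of any standard library; the smooth-ramp input is a contrived best case:

```csharp
using System;
using System.Linq;

// A smooth ramp: raw values spread over 256 symbols, but every
// residual is 1, so the residual stream collapses to one symbol.
var ramp = Enumerable.Range(0, 256).Select(i => (byte)i).ToArray();
var residuals = ramp.Skip(1).Select((v, i) => (byte)(v - ramp[i])).ToArray();

Console.WriteLine(Entropy(ramp));      // 8 bits/symbol
Console.WriteLine(Entropy(residuals)); // 0 — a single symbol carries no information

// Shannon entropy in bits per symbol: -sum(p * log2(p)).
static double Entropy(byte[] data)
{
    var counts = new int[256];
    foreach (var v in data) counts[v]++;
    double h = 0;
    foreach (var c in counts.Where(c => c > 0))
    {
        double p = (double)c / data.Length;
        h -= p * Math.Log2(p);
    }
    return h;
}
```

A Huffman coder assigns short codes to frequent symbols, so feeding it low-entropy residuals instead of the raw pixels is exactly where the extra compression comes from.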

Thanks for any enlightenment.

Edit:

So after playing with predictive coding on some images, I noticed the histogram of the data shows a lot of ±1 values for various pixels. It reduces the entropy quite a bit in some cases. Here's a screenshot:

(screenshot of the residual histogram)

Yes, you truncate. It doesn't matter, because you store the difference. This reduces the entropy because you mostly store small values, a lot of them being -1, 0 or 1. There are a couple of off-by-one bugs in your snippet, btw.
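On the truncation point: because the encoder and the decoder compute the same truncated average from the same (already-decoded) neighbours, and byte arithmetic wraps mod 256, the subtraction is exactly reversible. A minimal sketch with made-up values:

```csharp
int ave = 100;                          // same truncated average on both sides
byte pixel = 255;
byte residual = (byte)(ave - pixel);    // 100 - 255 = -155 → wraps to 101
byte restored = (byte)(ave - residual); // 100 - 101 = -1   → wraps to 255
// restored == pixel: lossless despite the truncated average
```

Note that encoding and decoding are the same operation (`ave - x` is its own inverse mod 256), which is why the two snippets above can share their structure.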

