I'd like to write a small applet to display astronomical images. As is
customary in astronomy, for us an image is an array img(x,y) of real
values in physical units, and colour is merely a way of representing
them. We usually scale the values img(x,y) according to user choice
(linear scale, log scale, histogram equalization) to normalized values
i(x,y), e.g. in the range [0,255] or [0.0,1.0], and use the latter as an
index into a lookup table (LUT) of R, G, B values (the LUT is also
chosen by the user).
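For concreteness, the linear case of that scaling step would look
roughly like this in Java (just a sketch; data, lo and hi are
placeholder names of mine, and the other scales would only change the
formula inside the loop):

    // Sketch of the linear scale: map physical values in [lo,hi] to
    // LUT indices in [0,255]; log scale or histogram equalization
    // would replace the formula for t.
    static byte[] scaleLinear(double[] data, double lo, double hi) {
        byte[] idx = new byte[data.length];
        for (int k = 0; k < data.length; k++) {
            double t = (data[k] - lo) / (hi - lo);  // normalize to [0.0,1.0]
            t = Math.max(0.0, Math.min(1.0, t));    // clip out-of-range values
            idx[k] = (byte) Math.round(t * 255.0);  // index into a 256-entry LUT
        }
        return idx;
    }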
I have code which does that in Xlib, in PostScript (using [/Indexed
/DeviceRGB 255 {LUT}] setcolorspace) and in RSI IDL (using
device,decomposed=0 and then setting tvlct,r,g,b, where r, g, b are
arrays of R, G, B levels). I'd like to do the same in Java, but I am
confused by the API documentation about images and color spaces, which
seems like overkill for my needs.
I therefore tried to define a small LUT of 10 colours (as a test) with
one-byte depth, and then used this code to fill the image:
    // red, gre and blu are byte[10] arrays holding the LUT levels
    IndexColorModel cm = new IndexColorModel(8, 10, red, gre, blu);
    BufferedImage img = new BufferedImage(30, 10, BufferedImage.TYPE_BYTE_INDEXED, cm);
    for (int jx = 0; jx < 30; jx++) {
        for (int jy = 0; jy < 10; jy++) {
            img.setRGB(jx, jy, jy);
        }
    }
    // g2 is the Graphics2D of my component
    g2.drawImage(img, myx, myy, null);
With the assignment I do in setRGB I'd expect to get a 30x10 image with
10 horizontal stripes, each of a different colour as defined in my
table. But what I get is always a gray rectangle, and the same occurs if
I do img.setRGB(jx,jy,CONSTANT);
What am I doing wrong?
As an example, this is the equivalent IDL code (the Java R, G, B values
will be in the range -128 to 127 instead of 0-255, since Java bytes are
signed):
IDL> red=[0,255,255, 0, 0,255,255, 0,128, 0]
IDL> gre=[0,255, 0,255, 0,255, 0,255,128,128]
IDL> blu=[0,255, 0, 0,255, 0,255,255,128, 0]
IDL> tvlct,red,gre,blu
IDL> img=fltarr(30,10)
IDL> for i=0,29 do begin &
for j=0,9 do begin &
img(i,j)=j &
endfor &
endfor
IDL> tv,img
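For reference, the corresponding Java byte arrays would be set up like
this (again a sketch; the (byte) casts are needed because Java bytes are
signed, which is what I meant by -128 to 127 above, but the stored bit
patterns are the same 0-255 levels):

    // The same 10-entry LUT as Java byte arrays; IndexColorModel reads
    // the bytes as unsigned, so (byte)255 and (byte)128 give the
    // intended 255 and 128 levels.
    byte[] red = { 0, (byte)255, (byte)255,         0,         0, (byte)255, (byte)255,         0, (byte)128,         0 };
    byte[] gre = { 0, (byte)255,         0, (byte)255,         0, (byte)255,         0, (byte)255, (byte)128, (byte)128 };
    byte[] blu = { 0, (byte)255,         0,         0, (byte)255,         0, (byte)255, (byte)255, (byte)128,         0 };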