I had a new feature to implement for my minesweeper app: editing the light source. On April 26th, 2020, I started developing a procedure to move the light source.
My first plan was as follows:
- Select the light source editing menu item.
- Display a light source icon on the game board.
- Tap the game board to move the light source.
- The light source and its icon move to the tapped place.
- Update the scene with the moved light source.
The app is rendered with WebGL, which displays a 3D image; internally, every object in the game is three-dimensional. I had to write a procedure to calculate a 3D coordinate from a 2D coordinate, because the display is a 2D coordinate system: a tap gives you only an x-y coordinate. I needed the z coordinate associated with that x-y coordinate. In computer graphics, this value is called the “z-depth” or “depth” of an x-y coordinate. Most 3D graphics libraries keep a z-depth buffer while rendering 3D objects to the display, so to get the z-depth you would normally just read it from that accumulated depth information. I read technical documents looking for a way to get the z-depth. In WebGL 1.0, I could not read the z-depth directly; I could only read the color value at an x-y coordinate. After reading a few more articles, I found that I could read the depth encoded as a color value. The way to read the depth is as follows:
- Prepare an offscreen buffer that is never shown to the user.
- Prepare a shader program that converts the depth value to a color value and writes it into the offscreen buffer.
- Render the 3D objects into the offscreen buffer.
- Read a color value from the offscreen buffer.
- Convert the color value back to a depth.
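The pack/unpack steps can be sketched numerically. This is a minimal Python model of the idea only, not the actual GLSL shader; the 4-bit channel width is an assumption matching the 16-bit readback described below, and all names are mine.

```python
BITS = 4          # assumed bits per color channel (RGBA, 16 bits total)
BASE = 1 << BITS  # 16 levels per channel

def pack_depth(depth):
    """Pack a depth in [0, 1) into four 4-bit channels, most significant first."""
    channels = []
    frac = depth
    for _ in range(4):
        frac *= BASE
        digit = int(frac)   # integer part becomes this channel's value
        frac -= digit       # keep the remainder for the next channel
        channels.append(digit)
    return channels

def unpack_depth(channels):
    """Reassemble the depth from the four channel values."""
    depth = 0.0
    for i, c in enumerate(channels):
        depth += c / (BASE ** (i + 1))
    return depth
```

In a shader the same idea is usually written with `fract` and per-channel scale factors; reading the pixel back gives the four channel values to feed into the unpacking step. The round trip is exact only for depths representable in 16 bits, which is the precision problem discussed next.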
In WebGL 1.0, I could read the color value back as a 16-bit value, so each component (r, g, b, a) had 4 bits. In the shader language specification, depth is represented as a floating-point value with 24 or more bits, so the conversion loses depth information.

I tried reading depth values from the offscreen buffer. Some of the values I got were greater than 1.0, which is clearly wrong, because depth must be in the range 0.0 to 1.0. I checked my code for about four hours but found no bug that could produce a z-depth over 1.0. I wrote test code that stored the value 0.5 in the offscreen buffer. I got back some values between 0.4 and 0.5, and some between 0.5 and 0.6, but never exactly 0.5. The average of the values was 0.5. I concluded that the color could not represent exactly 0.5 in the bits below the internal depth precision. I then checked the shader program that converts depth to color and found the bug: it stored the value as an 8-bit floating value, while each component could hold only 4 bits. If the value was greater than 6/8, the component saturated to 1.0. That was the reason I got depths greater than 1.0.

I fixed the bug and got depths less than 1.0, which seemed correct. Then I applied the inverse matrix to the depth and obtained a z coordinate for the light source. I noticed, however, that the z coordinate recovered from the rendered 2D image was smaller than the light source coordinate I had actually set; the difference was about 0.3. I assumed the difference came from the 16-bit value. This error was not acceptable to me, so I gave up on reading the depth from the offscreen buffer.
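For the "apply the inverse matrix" step, here is a sketch of just the z component, assuming a standard OpenGL-style perspective projection; the app's actual matrix may differ, and the function name is mine.

```python
def view_z_from_depth(depth, near, far):
    """Recover the view-space z behind a depth-buffer value in [0, 1],
    assuming a standard OpenGL perspective projection.

    Inverting the projection's z row gives:
        z_view = 2 * far * near / ((far - near) * ndc_z - (far + near))
    """
    ndc_z = 2.0 * depth - 1.0  # depth buffer [0, 1] -> NDC [-1, 1]
    return 2.0 * far * near / ((far - near) * ndc_z - (far + near))
```

Because this mapping is strongly nonlinear, a small quantization error in the 16-bit depth can become a much larger error in the recovered z, which matches the ~0.3 discrepancy above.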
I moved on to my second plan: computing the depth with the projection matrix and its inverse. The idea was to project the 3D rectangle that bounds where the light source may be moved. Each projected coordinate has a z value, and the rectangle is composed of two triangles. You can get the z-depth as follows:
- Find the triangle of the rectangle that contains the x-y coordinate.
- Calculate the interpolation ratios of the coordinate within that triangle.
- Calculate the z-depth by interpolating the z coordinates of the triangle's vertices.
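The three steps above can be sketched with barycentric coordinates. A minimal Python version follows; the function names and 2D conventions are my own. (NDC z is affine across a screen-space triangle, so plain linear interpolation is valid here.)

```python
def barycentric(p, a, b, c):
    """Barycentric ratios of 2D point p with respect to triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return u, v, 1.0 - u - v

def contains(bary):
    """The point lies inside the triangle when all three ratios are non-negative."""
    return all(w >= 0.0 for w in bary)

def interpolate_z(p, tri_xy, tri_z):
    """Interpolate the z value at p from the triangle's vertex z values."""
    u, v, w = barycentric(p, *tri_xy)
    return u * tri_z[0] + v * tri_z[1] + w * tri_z[2]
```

To handle the whole rectangle, run `barycentric` against each of its two triangles, pick the one whose ratios are all non-negative, and interpolate there.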
With this second plan, I was satisfied: it calculated the z-depth nicely.