Bug 1390

Summary: GLPixelBuffer.GLPixelAttributes::convert(GL, int, boolean) failed on unsupported GL data format/type
Product: [JogAmp] Jogl (Sven Gothel <sgothel>)
Component: opengl (Sven Gothel <sgothel>)
Status: RESOLVED FIXED
Severity: normal
Priority: P4
Version: 2.4.0
Hardware: All
OS: all
Type: DEFECT
SCM Refs: 90760ac8eebe7431ac7392e4ebf3f9009e63cd72
Workaround: ---
Related bug: 817

Description Sven Gothel 2019-09-05 05:23:46 CEST
GLPixelBuffer.GLPixelAttributes::convert(GL, int, boolean) failed on unsupported GL data format/type

On Mesa/AMD, the GLCaps chosen for the GLPBuffer used rgba 10/10/10/2,
and the GLContext set the default values:
GL_IMPLEMENTATION_COLOR_READ_FORMAT: 0x1908 GL_RGBA
GL_IMPLEMENTATION_COLOR_READ_TYPE: 0x8368 GL_UNSIGNED_INT_2_10_10_10_REV
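For illustration, GL_UNSIGNED_INT_2_10_10_10_REV packs one RGBA pixel into a single 32-bit word, with red in the 10 least significant bits and alpha in the top 2. The helper below is a hypothetical sketch (not part of JOGL) showing that layout:

```java
// Hypothetical decode helper for a GL_UNSIGNED_INT_2_10_10_10_REV pixel.
// "REV" component order: bits 0-9 red, 10-19 green, 20-29 blue, 30-31 alpha.
public final class Packed1010102 {
    public static int red(int pixel)   { return pixel & 0x3FF; }          // 10 bits
    public static int green(int pixel) { return (pixel >>> 10) & 0x3FF; } // 10 bits
    public static int blue(int pixel)  { return (pixel >>> 20) & 0x3FF; } // 10 bits
    public static int alpha(int pixel) { return (pixel >>> 30) & 0x3; }   // 2 bits
}
```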

GLPixelBuffer.GLPixelAttributes::getPixelFormat(int format, int type)
did not handle the type GL_UNSIGNED_INT_2_10_10_10_REV
and hence returned a null PixelFormat.

Therefore the GLPixelAttributes constructor failed and threw the exception:
"Caught GLException: Could not find PixelFormat for format and/or type:
 PixelAttributes[fmt 0x1908, type 0x8368, null]"

The solution is to pre-validate the GLContext default values in the convert(..) method and to fall back to the default GL_RGBA and GL_UNSIGNED_BYTE values if the queried pair is not supported. This also keeps the code future proof against further unhandled format/type combinations.
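A minimal sketch of that fallback idea (class and method names are assumptions for illustration; the actual fix is in commit 90760ac...): if the queried format/type pair has no PixelFormat mapping, substitute GL_RGBA / GL_UNSIGNED_BYTE, which every GL implementation must accept for glReadPixels.

```java
// Sketch only: pre-validate a queried (format, type) pair and fall back to safe defaults.
public final class ReadFormatFallback {
    static final int GL_RGB  = 0x1907;
    static final int GL_RGBA = 0x1908;
    static final int GL_UNSIGNED_BYTE = 0x1401;
    static final int GL_UNSIGNED_INT_2_10_10_10_REV = 0x8368;

    // Pairs assumed to have a PixelFormat mapping; 2_10_10_10_REV is not among them.
    private static boolean supported(int format, int type) {
        return (format == GL_RGB || format == GL_RGBA) && type == GL_UNSIGNED_BYTE;
    }

    /** Returns {format, type}, replaced by GL_RGBA/GL_UNSIGNED_BYTE when unsupported. */
    public static int[] validate(int format, int type) {
        return supported(format, type)
                ? new int[] { format, type }
                : new int[] { GL_RGBA, GL_UNSIGNED_BYTE };
    }
}
```

With this pre-validation in place, convert(..) never sees a pair for which getPixelFormat(..) would return null.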

Later we may add support for this 32-bit 2+10+10+10 encoding and its reverse.
Comment 1 Sven Gothel 2019-09-05 05:39:36 CEST
commit 90760ac8eebe7431ac7392e4ebf3f9009e63cd72