Android M is dropping 4096 texture support. What the hell, Google? I got an email about it today.
That means I need to remove 1080p+ support from both Greedy Mouse and Karma Mima (mostly backgrounds).
I'll add an option for the max surface size when Android M is detected.
Heck, why would they do that? Why mess with a winning team?
Surely as devices get faster, with more RAM and storage, you'd expect larger texture sizes to become acceptable.
Strange.
:/
Actually those limits are defined in the drivers. I don't think GPU vendors will remove support for 4k textures just because Google allows them to; drivers usually implement more than the bare minimum limits. We will see what happens in reality.
BTW space, why drop detail? Can't you simply tile your background by splitting your 4k texture into four 2k ones?
I can't split it into two textures without major work and a major slowdown. I can do semi-Retina support using an extra texture for the most important objects (balls, cheese etc.).
The Ouya and similar devices also didn't support 4096, but 1080p was too slow anyway, so I reverted to 720p.
Since this is important, I think a TEXTURESIZE entry should be added to the PLATFORMINFO$() command.
Hi Developer,
In preparation for the release of Android M, we've been rigorously testing the most popular applications on Google Play, including Greedy Mouse (com.spacefractal.greedymouse). During testing, we uncovered a bug specific to your application running on the upcoming M release. Here are the steps to reproduce:
Install Greedy Mouse app from https://play.google.com/store/apps/details?id=com.spacefractal.greedymouse
Open Greedy Mouse app
EXPECTED RESULTS:
It works
OBSERVED RESULTS:
It crashes
Devices Tested: Nexus 5, Nexus Player (Fugu)
This application doesn't fail with Android L but it fails with Android M MPZ44Q & MPZ79M
Reason for failure:
==============
In Android M, Texture support reverts from 4096 to 2048 textures and your application doesn't support 2048 textures and app fails to launch.
This bug can be reproduced using the M Developer Preview build. We wanted to let you know so you could take a look and address the issue. We're unable to provide direct assistance with implementation details, but if you uncover any issues in the M Developer Preview, please let us know by filing a bug in the issue tracker, rather than replying to this message.
Thanks!
The M Developer Preview team
Hmm, that's a clear fail by Google. :rant:
Checking the maximum texture size before loading is generally a good idea. The best way to do it is:
int maxsize;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxsize);
BTW OpenGL ES 2.0 only requires GL_MAX_TEXTURE_SIZE to be at least 64, while OpenGL ES 3.0 requires a minimum of 2048. So Google limits you to the lowest value ES 3.0 allows... I would guess this is because of support for cheap devices with cheap GPUs that have little RAM and low bandwidth. So basically they are nerfing high-spec devices in favour of cheap ones. It will be fun when there are devices with resolutions higher than 2048 (what about TV sticks on a 4k display?); they could not even load a fullscreen texture, or do an easy screen capture... I expect Google to drop this limitation (if it ever makes it into the final code), because it's just plain stupid.
I think I have a workaround using split 2048 textures, with Tiled as the tool. It will require 4 texture swaps, so I have no idea whether it will cause a slowdown.
But currently I've got the retina shadow and decorations working again from the first layer texture.
Could be Google is forcing app devs to produce more efficient apps, speed-wise, as opposed to graphical fidelity. On the other hand, kudos to Google for trying to keep older hardware in the game; almost the exact opposite of Apple in this respect.
Also... Apple hardware is faster than Android in the games I have experience with. My iPhone 6 and iPad 3 can run the game at 60fps with full detail, while most Android devices struggle to reach 60fps, especially when the display resolution is high. I recently added an unlocked 60fps option to the game.
However, I've got retina 95% fixed in Greedy Mouse using 4 tiled textures, and I only swap them once (thanks to Tiled, this work was a lot easier than expected). I still haven't tested it on the device, but that comes later today.
So don't use 4096 textures if you can avoid them, or at least also add a 2048 texture version, if you want to run on as many devices as possible.
I did use glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxsize);, but for some reason it failed.
Of course you first need to import the function before you can use it. If we need to do it manually, this is how we do it:
// somewhere global, preferably in the glbasic namespace. GLBasic probably imports the GL functions somewhere; that would be the place for this:
typedef unsigned int GLenum;
typedef int GLint;
#define GL_MAX_TEXTURE_SIZE 0x0D33
extern "C" {
#ifdef __WIN32__
#ifndef APIENTRY
#define APIENTRY __stdcall
#endif
#else
#ifndef APIENTRY
#define APIENTRY
#endif
#endif
void APIENTRY glGetIntegerv (GLenum pname, GLint* params);
}
Just copied this together without trying it; let me know how it works for you.
I did import it as well, of course. I do that a lot on AE and iOS.
Strange, it works for me (test run on Windows). Here is my test program:
maxtex()
FUNCTION foo:
ENDFUNCTION
INLINE
typedef unsigned int GLenum;
typedef int GLint;
#define GL_MAX_TEXTURE_SIZE 0x0D33
extern "C" {
#ifdef __WIN32__
#ifndef APIENTRY
#define APIENTRY __stdcall
#endif
#else
#ifndef APIENTRY
#define APIENTRY
#endif
#endif
void APIENTRY glGetIntegerv (GLenum pname, GLint* params);
}
ENDINLINE
FUNCTION maxtex:
INLINE
int maxsize;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxsize);
DEBUG ( maxsize );
ENDINLINE
ENDFUNCTION
Debug outputs 16384 for me, which seems to be OK. What problems do you get?
I was just surprised that the 2048 detection didn't work, that's it. But this thread was more to make sure people use at most 2048px textures, or at least do a correct fallback to 2048. So I think we need a command without INLINE; something like PLATFORMINFO$() is the right place.
I used this code:
GLOBAL GL_MAX_TEXTURE_SIZE = 0x0D33
GLOBAL SDL_DOUBLEBUF = 0x40000000
INLINE
#ifdef CENTERLINE_CLPP
#define signed
#endif
#define OGL ::
typedef unsigned int GLenum;
typedef unsigned char GLboolean;
typedef unsigned int GLbitfield;
typedef void GLvoid;
typedef signed char GLbyte;
typedef short GLshort;
typedef int GLint;
typedef unsigned char GLubyte;
typedef unsigned short GLushort;
typedef unsigned int GLuint;
typedef int GLsizei;
typedef float GLfloat;
typedef float GLclampf;
typedef double GLdouble;
typedef double GLclampd;
ENDINLINE
INLINE
} extern "C" { void __stdcall glGetIntegerv(GLenum pname, GLint *params); } namespace __GLBASIC__ {
ENDINLINE
FUNCTION glGetIntegerv: pname, BYREF params
INLINE
GLint i;
OGL glGetIntegerv(pname, &i);
params=i;
ENDINLINE
ENDFUNCTION
INLINE
} extern "C" { void __stdcall SDL_GL_SetAttribute(int, int); } namespace __GLBASIC__ {
ENDINLINE
FUNCTION enablevSync:
?IFDEF MACOSX
INLINE
// note: SDL_DOUBLEBUF (0x40000000) is a SDL_SetVideoMode surface flag, not a GL attribute;
// in SDL 1.2 the GL attributes are SDL_GL_DOUBLEBUFFER = 5 and SDL_GL_SWAP_CONTROL = 16
SDL_GL_SetAttribute(5, 1);  // SDL_GL_DOUBLEBUFFER
SDL_GL_SetAttribute(16, 1); // SDL_GL_SWAP_CONTROL (vsync)
ENDINLINE
?ENDIF
ENDFUNCTION
FUNCTION CheckTexturesize:
LOCAL result
glGetIntegerv(GL_MAX_TEXTURE_SIZE, result)
DEPRINT(result)
RETURN result
ENDFUNCTION
Also, I've seen in the log file that it outputs a texture size directly. I'll see if I can get that routed through.
Hmm, I'm not sure what that post was trying to say. Did you get it to work or not, in the end? If it works OK, then of course it's easy to add it to PLATFORMINFO$().
It probably did not work, since I got a report like this one. Anyway, I have removed the 4096 textures in Greedy Mouse. I will wait and implement it in PLATFORMINFO$() from that SDL info I've seen in the log, once Gernot and I are in sync on the source code again.