I’m an Android novice coming from web development, so I apologize if this question has an obvious answer; Google has not been able to provide me with one. I have a looping animation that I would like to turn into an Android live wallpaper that I can place on the app store.
I found a helpful tutorial on creating a basic live wallpaper here: http://www.techrepublic.com/blog/android-app-builder/a-bare-bones-live-wallpaper-template-for-android/. After building my own version of that template I thought I could replace the “draw” function with one that created a media player and played an mpg to the screen, but I’ve had no luck finding a tutorial to do this.
Then I ran across this: Android video as a live wallpaper, which indicates that playing an mpg may be far more difficult and problematic than I originally assumed. This seems to provide a solution: How to play video as live wallpaper android? However, it requires an external library whose documentation specifically warns that it adds a great deal of complexity, implying it should only be used by experienced developers.
So I’m going back to basics and asking: how would a novice developer create a live wallpaper from an animation? Using a video format with the native media player? Drawing each frame to the screen as a PNG? Using a freely available API?
Since posting I have done some research and developed two different apps, but both are unsatisfactory.
In the first, I decode each frame (stored as a PNG file) and keep the resulting bitmaps in an array. Then I loop every 50 ms and draw each frame directly to the canvas from the array. This method runs relatively smoothly, but memory usage is atrocious: since each PNG is decoded and then stored as a bitmap, I run out of memory with 1-3 seconds of video (depending on the device).
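Rough arithmetic shows why this blows up. The figures below are hypothetical (480×800 frames at ARGB_8888, i.e. 4 bytes per pixel, one frame per 50 ms), but any similar resolution gives the same order of magnitude:

```java
public class FrameMemory {
    public static void main(String[] args) {
        // Hypothetical figures: 480x800 frames, ARGB_8888 (4 bytes/pixel),
        // one frame every 50 ms = 20 frames per second.
        long bytesPerFrame = 480L * 800L * 4L;
        long framesPerSecond = 1000 / 50;
        long secondsOfVideo = 3;
        long totalBytes = bytesPerFrame * framesPerSecond * secondsOfVideo;
        // Roughly 87 MB for just 3 seconds of decoded frames -- far beyond
        // the per-app heap of that era's devices (often only a few dozen MB).
        System.out.println(totalBytes / (1024 * 1024) + " MB");
    }
}
```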
In the second I read each frame into a byte array as a PNG file. This lets me keep them all in RAM for quick access, but because they stay compressed, memory usage is minimized. During the loop I decompress each one on the fly from its byte array before drawing it to the canvas. This method uses less than half the memory of the first. However, the animation is jumpy, presumably from all the garbage collection the per-frame decoding requires.
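On Android this would be BitmapFactory.decodeByteArray inside the draw loop; since that API needs a device, here is a rough desktop-Java sketch of the same store-compressed/decode-per-frame pattern using javax.imageio (the tiny 4×4 frames are synthetic stand-ins for the real PNGs):

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class CompressedFrames {
    public static void main(String[] args) throws IOException {
        // Stand-ins for the animation frames: tiny synthetic images,
        // held in RAM only as compressed PNG bytes.
        List<byte[]> pngFrames = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            BufferedImage frame = new BufferedImage(4, 4, BufferedImage.TYPE_INT_ARGB);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            ImageIO.write(frame, "png", out);
            pngFrames.add(out.toByteArray());
        }
        // Playback loop: decode exactly one frame at a time, so only one
        // uncompressed bitmap ever lives on the heap -- at the price of a
        // fresh allocation (and eventual GC) every frame.
        for (byte[] png : pngFrames) {
            BufferedImage decoded = ImageIO.read(new ByteArrayInputStream(png));
            System.out.println(decoded.getWidth() + "x" + decoded.getHeight());
        }
    }
}
```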
Would openGL or playing an mpeg through the media player offer better performance than these? Or is there a way to optimize one of the above methods?
I have further refined method 2 above. I was previously setting a 40 ms delay at the end of the drawing function for it to call itself again. However, since the decode and draw steps took so much time, and unpredictably so, this caused delays of well over 40 ms between frames. I’ve since moved the 40 ms postDelayed call to the beginning of the procedure, before the decode and draw processing. This has led to far smoother playback, though at the cost of about 50% processor usage while the wallpaper is visible. So far this has only been tested on a high-end ICS device; I plan to give it a go on a Droid X in the next day or two to see what happens.
I’m still not convinced this is the best method and would appreciate input. I have a nagging feeling that I’m doing it way wrong.
Playback was relatively smooth on a high-end device, but not on my Droid X. I thought this might be due to all the scaling that has to occur when displaying an image at a resolution different from its native one, so I built code to save each frame out as a JPG at the exact resolution of the phone. This led to OOM errors on some devices, so I switched to half resolution. That worked, though memory usage was still high.
I attempted to reduce memory further by using a single background frame for the top two-thirds of the animation (which is mostly static) and only saving out frames for the bottom third. However, playback now clocks in much more slowly, around 10 FPS, even on a high-end device, presumably because each frame needs to be redrawn in its entirety and there are now two bitmaps on the canvas. Memory usage is very good with this method, though.
There was a suggestion for me to enable hardware acceleration, which I understand can’t be done when posting to a SurfaceView in a live wallpaper like this. As I searched around I realized that a graphics engine might be a good direction to go, so I tried AndEngine. I converted my frames to a single giant JPG (15,000 × 3,040 pixels) and loaded it into a sprite using this code:
    waterTexture = new BitmapTextureAtlas(this.getTextureManager(), 15000, 3040, TextureOptions.BILINEAR);
    waterRegion = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(waterTexture, this.getAssets(), "wateranimation.jpg", 0, 0, 25, 16);
    waterTexture.load();
I had hoped AndEngine would have some way of intelligently loading this, but instead I just got an out of memory error with my wallpaper requesting just under 200 MB of RAM.
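That figure is consistent with the atlas dimensions: an uncompressed RGBA texture costs width × height × 4 bytes no matter how well the JPG compresses on disk (and, separately, GPUs of that era typically cap texture dimensions around 2048-4096 px, so a 15,000 px wide atlas would likely be rejected even with enough RAM):

```java
public class AtlasMemory {
    public static void main(String[] args) {
        // An OpenGL texture lives in memory uncompressed: 4 bytes per pixel
        // for RGBA, regardless of how small the source JPG is on disk.
        long width = 15000, height = 3040; // the atlas dimensions from the code above
        long bytes = width * height * 4;
        // Pixel data alone, before any other allocations in the app.
        System.out.println(bytes / (1024 * 1024) + " MB");
    }
}
```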
This project has now consumed well over 50 hours for what I feel should have been one line of code: “backgroundHolder = movie.mpg”. I have asked several questions, both on SO and on other websites, with relatively few responses. This makes me question whether what I’m trying to do is even possible, despite the fact that I was watching smooth 800×600 movies a decade ago on a PC with a fraction of the processing power and memory of even my Droid X. I know the hardware isn’t the problem. Is it my poor coding, or a limitation of Java? Is this possible at all? Just finding out that it isn’t possible would at least let me stop wasting time on it.
I’m going to attempt to add a bounty to this question.
Related questions I have started regarding this project: