Common Animation Approaches on the Android Platform


In the Android clients on the market today, product and interaction designers chasing a better user experience often fill the interface with animations. A traditional static page no longer satisfies users' aesthetic expectations.


Animations come in many varieties. In terms of trigger timing, there are click-triggered animations, page-open animations, and continuous animations that follow the user's finger. Animations tightly coupled to data, such as graphs and charts, may even require writing complex custom Views.


Most of the animation requirements I encounter day to day are delivered alongside the UI design, such as short, one-off display animations; these can be implemented fairly casually and don't impose strict resource formats. The common options are as follows.


Frame Animation

In Android, frame animation is implemented with Drawable animation: create an AnimationDrawable, define a series of frames in XML (each frame being a drawable resource), then start the animation in code.

Here is a simple example in two steps:


1. Create a file named frame_animation.xml in the res/drawable directory and define the frames for the animation:

<animation-list xmlns:android="http://schemas.android.com/apk/res/android"
    android:oneshot="false">
    <item android:drawable="@drawable/frame1" android:duration="100" />
    <item android:drawable="@drawable/frame2" android:duration="100" />
    <item android:drawable="@drawable/frame3" android:duration="100" />
</animation-list>


Here android:oneshot="false" means the animation will loop; if it is set to true, the animation plays only once. android:duration specifies how long each frame is displayed, in milliseconds.


2. In your layout file (e.g. activity_main.xml), add an ImageView to show the animation:

<ImageView
    android:id="@+id/imageView"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/frame_animation" />

Then start the animation in code:

ImageView imageView = (ImageView) findViewById(R.id.imageView);
final AnimationDrawable frameAnimation = (AnimationDrawable) imageView.getDrawable();

// Post to the view's message queue so the drawable is attached before starting.
imageView.post(new Runnable() {
    @Override
    public void run() {
        frameAnimation.start();
    }
});


Make sure each of your drawable resources is the same size so the frames display correctly during the animation. This gives a simple frame animation that starts looping automatically when the Activity loads.

 PAG Animation


Compared with the frame animation above, PAG is more performance-friendly. PAG is a complete animation workflow solution developed independently by Tencent. It was created in 2016, originally to solve animation rendering in complex video-editing scenarios, and it also covers UI animation and live-streaming scenarios; it was open-sourced on GitHub in January 2022.


It is fairly simple to use: just grab the release version from the GitHub page and add the dependency in Gradle.

implementation 'com.tencent.tav:libpag:3.2.7.40'


Then place the PAGView in our application's XML layout; no additional attributes need configuring:

 <org.libpag.PAGView
        android:id="@+id/pagview"
        android:layout_width="@dimen/dp_1190"
        android:layout_height="@dimen/dp_1110"
        android:layout_marginTop="@dimen/dp_290"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintTop_toTopOf="parent" />


Finally, set its file source and loop mode in code, then start playback:

  @Override
    protected void onCreate(Bundle savedInstanceState) {
        LogUtils.d("AirConditioner Start");
        super.onCreate(savedInstanceState);
        setContentView(R.layout.instance);

        PAGView pagView = findViewById(R.id.pagview);
        // fileName is the path of the .pag file under assets.
        PAGFile pf = PAGFile.Load(getAssets(), fileName);
        pagView.setComposition(pf);
        pagView.setRepeatCount(-1); // loop indefinitely
        pagView.play();
    }

 MP4 animation


This single-purpose approach is the simplest to implement, so I won't write a full demo: obtain a MediaPlayer instance, bind it to a SurfaceView or TextureView, point it at the video file, and play. Note that SurfaceView playback may show a black screen at the start; a static placeholder image can cover it.
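Still, a minimal sketch for orientation, assuming a Kotlin project and an MP4 bundled in assets (the asset name and class name are hypothetical):

import android.content.Context
import android.graphics.SurfaceTexture
import android.media.MediaPlayer
import android.view.Surface
import android.view.TextureView

// A TextureView that plays a looping MP4 from assets.
class Mp4AnimView(context: Context) : TextureView(context), TextureView.SurfaceTextureListener {

    private var player: MediaPlayer? = null

    init {
        surfaceTextureListener = this
    }

    override fun onSurfaceTextureAvailable(st: SurfaceTexture, width: Int, height: Int) {
        player = MediaPlayer().apply {
            val afd = context.assets.openFd("wind_anim.mp4") // hypothetical file
            setDataSource(afd.fileDescriptor, afd.startOffset, afd.length)
            afd.close()
            setSurface(Surface(st))
            isLooping = true
            // Start only once prepared, which also shortens the initial black frame.
            setOnPreparedListener { it.start() }
            prepareAsync()
        }
    }

    override fun onSurfaceTextureDestroyed(st: SurfaceTexture): Boolean {
        player?.release()
        player = null
        return true
    }

    override fun onSurfaceTextureSizeChanged(st: SurfaceTexture, width: Int, height: Int) = Unit
    override fun onSurfaceTextureUpdated(st: SurfaceTexture) = Unit
}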

 Interactive animations


As the name suggests, this type of animation must follow the user's actions and update its presentation in real time. The most common example is the pinned-header ("ceiling") animation: while a list scrolls, we listen to the scroll distance and dynamically set the position, alpha, color, and other properties of Views at the top or side of the screen. Other examples include the all-apps page-flip animation in SystemUI, or applying a Gaussian blur to the desktop wallpaper in proportion to the pull-down distance of the minus-one screen.
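A minimal sketch of such a scroll-driven "ceiling" animation, fading a header in proportion to list scroll; the fade distance and names are hypothetical:

import android.view.View
import androidx.recyclerview.widget.RecyclerView

fun bindHeaderFade(list: RecyclerView, header: View, fadeDistancePx: Float = 300f) {
    list.addOnScrollListener(object : RecyclerView.OnScrollListener() {
        private var scrolled = 0
        override fun onScrolled(rv: RecyclerView, dx: Int, dy: Int) {
            scrolled += dy
            // Map the accumulated scroll distance to [0, 1] and drive the alpha.
            header.alpha = 1f - (scrolled / fadeDistancePx).coerceIn(0f, 1f)
        }
    })
}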

Kanzi Animations


Speaking of animations that can respond to the hand, Kanzi has to be mentioned. The following description is taken from Wikipedia and the official website:


Kanzi is an industry-leading 3D engine and UI development tool. It supports efficient immersive 3D effects, cross-system multi-screen connectivity, and seamless integration with the Android ecosystem, and has become the preferred UI development tool and engine for the smart cockpits of the world's leading car manufacturers. The updated Kanzi architecture is deeply compatible with the Android operating system and ecosystem, and Kanzi provides powerful graphic design support on top of Android functionality, ensuring high-quality visual results.


As for how Kanzi is integrated and used, I did not set it up from scratch myself, so I will only outline the sequence; corrections are welcome. First, integrate the Runtime.aar that Kanzi requires, the Kanzi Java support library AAR, the resource files, the txt resource manifest, and so on. The non-compressible file types also need to be declared in Gradle, otherwise resources may fail to load.
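For example, a minimal Gradle sketch; the '.kzb' extension is inferred from the layout attribute below, and the exact extensions depend on your Kanzi delivery:

aaptOptions {
    // Keep Kanzi binary bundles uncompressed so the runtime can load them from assets.
    noCompress 'kzb'
}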


To use it, we first declare the view in the XML layout and bind it to its resource file by filling in the resource name under assets via attributes:

 <com.rightware.kanzi.KanziTextureView
        android:id="@+id/tx_KanziSurfaceView"
        android:layout_width="@dimen/dp_2560"
        android:layout_height="@dimen/dp_1190"
        app:clearColor="@android:color/transparent"
        app:kzbPathList="climate.kzb"
        app:layout_constraintTop_toTopOf="parent"
        app:name="climate"
        app:startupPrefabUrl="kzb://climate/StartupPrefab"
        tools:ignore="MissingConstraints" />


In code we need a utility class for communication, and we add listeners to it to receive and send the uplink and downlink signals:


public interface AndroidNotifyListener {
    void notifyDataChanged(String name, String value);

    void dataSourceFinish();
}


// Register/unregister a listener on the communication utility class,
// and push a value (here a drag coordinate) down to the Kanzi scene:
AndroidUtils.setListeners(this);
AndroidUtils.removeListeners(this);
AndroidUtils.setValue(SourceData.RightMidMove_up2down, y);

 Unity Animation


Unity's name in the gaming world precedes it; I remember that many games I played as a child opened with a splash screen bearing the big Unity logo and lettering.


The following introduction is from Wikipedia and the official website: Unity is a platform for creating and operating real-time 3D interactive content. Creators of all kinds, including game developers and professionals in fine art, architecture, automotive design, and film and television, turn their ideas into reality with Unity. The Unity platform offers a complete suite of software solutions for creating, operating, and monetizing real-time interactive 2D and 3D content on supported platforms, including phones, tablets, PCs, game consoles, and augmented and virtual reality devices. Unity is one of the world's leading 3D engines, and it provides full-stack support for 3D HMI, meaning creative consulting, performance tuning, and project development solutions across the entire HMI workflow, from concept design to mass-production deployment, creating striking interactive experiences for in-vehicle infotainment systems and smart cockpits.


In fact, the first version of our project integrated the Kanzi solution. Its performance was worse than Unity's, and crucially, as the project advanced, the animation polish the engineers could squeeze out of it failed to satisfy the evaluation department, so in a later iteration we switched to Unity. The focus of this article is therefore on using Unity 3D animation. The case study is the wind-direction adjustment in the air-conditioning app of a car IVI system, whose interaction logic is more complex than the examples above: the animation must track the finger in real time, keep reshaping itself while the finger stays inside the hot zone, and maintain bidirectional communication so the animation remains consistent with the actual signals in the car.

 Two options for Unity integration


After all this animation groundwork we finally get to the point. This article will not delve into Unity's rendering principles for now; it only covers integration and usage.

Communication protocol design


The first step is not to create a project, but to agree on the communication protocol with Unity in advance, based on the HMI's product interaction definition: which features are toggled on and off, and which properties need adjusting. The air-conditioning app involves opening and closing several air vents, which can be distinguished with 0/1, as well as adjusting the vent direction, which requires passing x,y coordinates. Android and Unity communicate via JSON strings, and packing and unpacking them with Google's Gson or another third-party library is quite simple.
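A minimal sketch of such a protocol message with Gson; the field names are hypothetical, as the real protocol is project-specific:

import com.google.gson.Gson

// One vent-control message exchanged between Android and Unity.
data class VentMessage(
    val ventId: Int, // which air vent
    val state: Int,  // 0 = closed, 1 = open
    val x: Float,    // wind-direction coordinates
    val y: Float
)

private val gson = Gson()

fun pack(msg: VentMessage): String = gson.toJson(msg)                                 // Android -> Unity
fun unpack(json: String): VentMessage = gson.fromJson(json, VentMessage::class.java) // Unity -> Android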


Moreover, the communication link between the two sides depends on how Unity is integrated. The process isolation scheme discussed first below integrates the full Unity dependency package and communicates through its JNI interfaces, while the Client/Server architecture communicates with a separate server process through Android's AIDL interfaces. The other link, the interaction with the vehicle signals, touches on confidential project architecture and is not described here.


Process Isolation Scheme – UAAL (Unity as a Library)


In this approach, Unity packages the rendering engine, resource files, and Android communication code into an AAR. Its size varies with the complexity of the animation and adds to the size of the integrating app's APK, and however many parties in the project want to use Unity animations, that many copies of the rendering engine are needed. Since the client is responsible for creating, destroying, showing, and hiding the Unity controls, this solution generally suits one-to-one scenarios with simple communication links, i.e., projects where only one module needs Unity animations. When multiple modules need Unity, the process isolation scheme is also more demanding on performance.


The control used by the upper layer is UnityPlayer, a custom Unity FrameLayout with its own implementation of addView, display, and rendering logic. The resource files all live inside the dependency package Unity produces and are not exposed.

 Integration steps

The process isolation scheme is integrated as follows:


The first step is to place the AAR provided by Unity in the libs folder and add a compile dependency on it in Gradle:

implementation files('libs/UnityAnimation_0321V4.aar')


The second step is to configure in Gradle the NDK version Unity requires, configure abiFilters to choose which architectures' native libraries get packaged into the APK (a car project only needs one fixed architecture), and set the uncompressed file types so Unity can locate its resources without problems:

ndkVersion "23.1.7779620"

aaptOptions {
        noCompress = ['.tj3d', '.ress', '.resource', '.obb', '.bundle', '.tuanjieexp', 'global-metadata.so'] + tuanjieStreamingAssets.tokenize(', ')
        ignoreAssetsPattern = "!.svn:!.git:!.ds_store:!*.scc:!CVS:!thumbs.db:!picasa.ini:!*~"
    }

ndk{
    abiFilters 'arm64-v8a'
}



Note that we also need to add a string resource required by Unity to the project's strings.xml, otherwise the Unity side will throw a null pointer exception:

 <string name="game_view_content_description">Game view</string>


The third step is to make the Activity that will display the Unity animation inherit from UnityPlayerActivity. The creation, destruction, showing, and hiding of Unity's core display control, UnityPlayer, are all managed by this UnityPlayerActivity. The Activity subclass then adds mUnityPlayer to its own root ViewGroup via addView as the background, and other View controls can still be stacked on top of it in the XML, as in the sketch below.
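A minimal sketch of this step, assuming the standard UnityPlayerActivity template in which the base class creates mUnityPlayer; the layout and view ids are hypothetical:

import android.os.Bundle
import android.view.ViewGroup
import android.widget.FrameLayout
import com.unity3d.player.UnityPlayerActivity

class WindAnimActivity : UnityPlayerActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)      // base class creates mUnityPlayer
        setContentView(R.layout.activity_wind)  // replaces Unity's default content view

        // Re-attach the Unity surface at the bottom of our root layout;
        // Views declared in activity_wind.xml stay stacked above it.
        (mUnityPlayer.parent as? ViewGroup)?.removeView(mUnityPlayer)
        findViewById<FrameLayout>(R.id.root).addView(
            mUnityPlayer, 0,
            FrameLayout.LayoutParams(
                ViewGroup.LayoutParams.MATCH_PARENT,
                ViewGroup.LayoutParams.MATCH_PARENT
            )
        )
        mUnityPlayer.requestFocus()
    }
}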


The fourth step is to encapsulate a Unity communication utility class. Android can send messages to Unity directly through UnityPlayer's static UnitySendMessage method, passing the object name specified in the Unity communication protocol:

 UnityPlayer.UnitySendMessage(OBJ_NAME, METHOD_NAME, communicateMessage)


Unity is developed in C#, and it delivers messages to the upper Android layer by reflectively invoking methods of a signal class. It is therefore best to make the signal management class a singleton and leave Unity a method or member through which it can obtain our instance, so the reflection callback goes through smoothly. Here I use a Kotlin object declaration and expose a public unityInstance member. The method onReceiveMsgFromUnity is what Unity calls via reflection; in it we parse the signal and pass it on to the View. Note that this method is not invoked on the main thread, so a round of optimization is needed later.

object UnityMessageHelper {

    // Exposed so Unity's C# side can reflectively obtain this singleton.
    val unityInstance = this

    // Views register here to receive Unity signals
    // (listener interface implied by usage; declared elsewhere in the project).
    private val listenerList = mutableListOf<UnityMessageListener>()

    // Invoked by Unity via reflection -- note: NOT on the main thread.
    fun onReceiveMsgFromUnity(msg: String) {
        LogUtils.d(TAG, "onReceiveMsgFromUnity: $msg")
        if (listenerList.size > 0) {
            listenerList.forEach {
                it.onReceiveUnityMessage(msg)
            }
        }
    }
}


Optimizing the signal class UnityMessageHelper


Since the target project is an air-conditioning app, callbacks fire at a high frequency while the user drags to adjust the wind direction, and the lower layers also upload data at a high frequency in automatic sweep mode, so handling that much data on the main thread is not appropriate. We use Kotlin coroutines with the Default dispatcher for this CPU-intensive work. On both links, the user's finger-drag handling and Unity's reflection callback already run on worker threads, so we call back into the View class through a custom interface and wrap the update in a MainScope launch to make sure the UI is updated on the main thread. In automatic sweep mode we receive the coordinates of the tapped vent from the domain controller, and once the data arrives we send a signal down to Unity to update the animation's pointing direction. That can be done with a coroutine context switch, withContext(Dispatchers.Default), which hops to a worker thread before sending to Unity.
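A minimal sketch of the two dispatch paths described above, reusing pack/unpack from the protocol sketch and listenerList/OBJ_NAME/METHOD_NAME from the surrounding snippets; everything else is hypothetical:

import com.unity3d.player.UnityPlayer
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.MainScope
import kotlinx.coroutines.launch
import kotlinx.coroutines.withContext

private val mainScope = MainScope()

// Unity's reflection callback arrives on a worker thread: parse the payload
// on the Default dispatcher, then switch to Main before touching any View.
fun onReceiveMsgFromUnity(msg: String) {
    mainScope.launch(Dispatchers.Default) {
        val parsed = unpack(msg) // CPU-bound JSON unpacking
        withContext(Dispatchers.Main) {
            listenerList.forEach { it.onReceiveUnityMessage(msg) } // or forward `parsed`
        }
    }
}

// Sweep-mode coordinates from the domain controller: hop to a worker
// thread before packing the signal and sending it down to Unity.
fun onSweepCoordinate(x: Float, y: Float) {
    mainScope.launch {
        withContext(Dispatchers.Default) {
            UnityPlayer.UnitySendMessage(OBJ_NAME, METHOD_NAME, pack(VentMessage(0, 1, x, y)))
        }
    }
}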

 Problems encountered


The base Activity in the AAR Unity provides fits most ordinary applications, but my air-conditioning app here is positioned as a transient hover panel: there is actually no Activity in my project at all, and I use WindowManager to add a high-z-order window that displays the interface as a View.


So we can't use their UnityPlayerActivity; we can only use the raw UnityPlayer and manage its creation, destruction, resume, and pause ourselves. The thing to watch is that creating a UnityPlayer requires a Context, and since the application has no Activity, only a non-Activity Context can be passed. In practice we found that the UnityPlayer instance can only be created successfully after our application has obtained a window handle; otherwise it throws an error.


So the correct creation and initialization order is: first inflate a ViewGroup from an XML layout and add it with WindowManager; after its onViewAttachedToWindow callback fires, create the UnityPlayer instance, add it to that ViewGroup, and call its resume method:

    public void initUnity() {
        if (mUnityPlayer == null) {
            LogUtils.i(TAG, "initUnity");
            unityInitView = (LinearLayout) LayoutInflater.from(mContext).inflate(R.layout.layout_unity_init, null);
            unityInitView.addOnAttachStateChangeListener(new View.OnAttachStateChangeListener() {
                @Override
                public void onViewAttachedToWindow(@NonNull View v) {
                    LogUtils.w(TAG, "unityInitView onViewAttachedToWindow");
                    mUnityPlayer = new UnityPlayer(mContext);
                    unityInitView.addView(mUnityPlayer);
                    mUnityPlayer.requestFocus();
                    mUnityPlayer.resume();
                    mUnityPlayer.windowFocusChanged(true);
                }

                @Override
                public void onViewDetachedFromWindow(@NonNull View v) {
                    LogUtils.w(TAG, "unityInitView onViewDetachedFromWindow");                      airConditionerView.addUnityToAirView();
                }
            });

           mWindowManager.addView(unityInitView, initUnityWindowParams());
        }
    }


Note that a UnityPlayer added this way has an otherwise unavoidable black-screen problem: Unity's rendering takes at least four or five seconds to load, during which we can only cover it with a static background image in a higher-level View. Once Unity finishes loading and fires its ready callback, we remove the static image and reveal the Unity animation underneath. This is one of the awkward aspects of the process isolation solution. My workaround is to add a View to the screen at boot to initialize and load Unity, then, once it has loaded, remove the UnityPlayer from that View and re-add it to the actual window where it should be displayed, so that opening the interface skips the load time and trims the page jank a little.
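A minimal Kotlin sketch of what addUnityToAirView from the snippet above might do (an assumption; the real implementation is project-specific):

// Called once Unity is preloaded: detach the player from the boot-time
// holder and re-parent it into the real panel's layout.
fun addUnityToAirView(airView: ViewGroup) {
    (mUnityPlayer.parent as? ViewGroup)?.removeView(mUnityPlayer)
    airView.addView(mUnityPlayer, 0) // keep it behind the panel's own Views
    mUnityPlayer.windowFocusChanged(true)
}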


Single-Process Solution – URAS (Render as Service)


The Unity Render-as-Service (URAS) rendering solution is unique to the Unity engine. It removes the need to integrate a full Unity 3D player into each of several Android apps; instead, a single renderer runs in the background and is called directly by the foreground apps, saving system resources and better suiting multi-application animation design with one-take (seamless) transitions.

Advantages over the process isolation scheme


This scheme is an upgrade of the UAAL scheme, so some of the preliminary work is identical and will not be repeated.


The idea is to hand the several Unity engines to be displayed over to one server for unified control. Building the server APK is just a matter of dropping the server AAR provided by Unity into an empty project; the internal logic is likewise hidden in the AAR. Server and client communicate through the familiar AIDL interface. We need to set this server up as a persistent application so that it starts at boot and performs its rendering work automatically: other applications can then display content instantly on demand, its resources are not reclaimed after long periods without display, and the client's black-screen problem is solved as well.
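As a hedged sketch, marking the server as persistent is done in its manifest; note this attribute is only honored for apps built into the system image:

<!-- Server AndroidManifest.xml: keep the render process alive from boot. -->
<application
    android:persistent="true"
    ... >
</application>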


Compared with the UAAL solution, the client only needs to integrate a very small Client.aar, which helps keep the client APK size down.

 Integration and Usage


We just introduce the client AAR in Gradle, add the remote Unity view to our own layout after the Gradle sync, configure the display parameter (used by the server to distinguish which engine's content to render), and specify the server's package name. Two view types are available, SurfaceView and TextureView. My interface is a hover window designed with fade-in/fade-out animations, and a SurfaceView's alpha cannot be animated linearly, so I chose TextureView as the container:

<com.unity3d.renderservice.client.TuanjieView
        android:id="@+id/unityview"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:tuanjieDisplay="2"
        app:tuanjieServicePkgName="com.tuanjie.renderservice"
        app:tuanjieViewType="TextureView" />


The remaining code is simply starting the server-side Service, registering a service connection callback, and the message callback. Since the server is an engine shared by several clients, there is no need even to handle resume and pause, as those two operations would take effect for all clients. We just need to start the service with the correct display, and panels pushed to the background can be shown and hidden with setVisibility. In addition, our communication utility class UnityMessageHelper needs to implement two interfaces, a service connection status interface and a message callback interface for business data. The code is as follows:

object UnityMessageHelper : TuanjieRenderService.Callback, SendMessageCallback {

    fun initUnityService() {
        LogUtils.i(TAG, "initUnityService")
        unityRenderService = 
            TuanjieRenderService.init(appContext,TUANJIE_PACKAGENAME).apply {
                enableAutoReconnect = true
                addCallback(this@UnityMessageHelper)
                addSendMessageCallback(this@UnityMessageHelper)
                ensureStarted()
        }
    }


    override fun onServiceConnected() {
        LogUtils.w(TAG, "onUnityRenderServiceConnected")
    }
   
    override fun onServiceDisconnected() {
        LogUtils.w(TAG, "onUnityRenderServiceDisConnected")
        messageScope.launch {
            delay(500L)
            initUnityService()
            unityRenderService.resume()
        }
    }
   
    override fun onServiceStartRenderView(p0: Int) {
        LogUtils.i(TAG, "onServiceStartRenderView")
    }

    // Message from Unity that expects a return value (unused here).
    override fun onClientRecvMessage(message: String?) = null

    // Fire-and-forget message from Unity carrying business data;
    // parse it and dispatch it to the registered listeners here.
    override fun onClientRecvMessageWithNoRet(msg: String?) {
    }

}


Also, Android's way of sending signals to Unity switches from UnityPlayer's static method to a method call on this service instance:

  unityRenderService.c2sSendMessage(OBJ_NAME, METHOD_NAME, communicateMessage)


Receiving callback messages from Unity likewise moves to the callback methods registered via addSendMessageCallback.


It is fair to say that, thanks to its unified control and one-to-many nature, the URAS solution beats the UAAL solution in both performance and ease of client integration. At the architecture level it can link more animation modules together to achieve silky-smooth one-take transitions.


The above covers several kinds of animation commonly used on Android, along with notes on two common Unity integration schemes. Beyond this most superficial usage level, a follow-up could dig into Unity's principles, or even use Unity's own development tools to produce and integrate an animation first-hand, understanding the whole chain, so as to integrate Unity more efficiently and add polish to one's own applications.

By hbb
