This is a guest post from amo engineer Cyril Mottier on how they've leveraged Android to create magic in their Android app.
At amo, we're redefining what it means to build social applications. Our mission is to create a new kind of social company, one that prioritizes high-quality, thoughtfully designed mobile experiences. One of our flagship applications, Bump, puts your friends on the map, whether you're checking in on your crew or making moves to meet up.
Our app leverages multiplatform technologies for its foundation. At the core lies a shared Rust-based library that powers all of our iOS and Android apps. This library, managed by our backend engineers, is responsible for persistence and networking. The library exposes its APIs as Kotlin Flow. In addition to making everything reactive and realtime-enabled by default, it integrates effortlessly with Jetpack Compose, the technology we use to build our UI. This architecture ensures a consistent and high-performance experience across platforms. It also allows mobile engineers to spend more time on the user experience, where they can focus on crafting innovative and immersive user interactions.
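To make that concrete, here is a minimal sketch of what consuming such a Flow-based API from Compose can look like. The FriendsStore and Friend types below are illustrative placeholders, not the actual API surface of our shared library:

import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.lifecycle.compose.collectAsStateWithLifecycle
import kotlinx.coroutines.flow.Flow

// Hypothetical shape of the shared layer: a store exposing a Flow of data.
interface FriendsStore {
    fun observeFriends(): Flow<List<Friend>>
}

data class Friend(val id: String, val displayName: String)

@Composable
fun FriendCount(store: FriendsStore) {
    // Because the shared layer is Flow-based, the UI is reactive by default:
    // any change pushed by the core recomposes this text automatically.
    val friends by store.observeFriends()
        .collectAsStateWithLifecycle(initialValue = emptyList())
    Text("${friends.size} friends")
}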
In this post, we'll explore how we leverage the Android SDK, Jetpack Compose, the Kotlin programming language, and Google Play Services to build unique, delightful experiences in Bump. Our goal is to break mental barriers and show what's truly possible on Android, often in just a few lines of code. We want to inspire engineers to think beyond conventional UI paradigms and explore new ways to create magical moments for users. By the end of this article, you'll have a deeper understanding of how to harness Android's capabilities to build experiences that feel like magic.
At amo, we ship, learn, and iterate rapidly across our feature set. That means the design of some of the features highlighted in this article has already changed, or will do so in the coming weeks and months.
Great touch-based UX isn't just about flashy visuals. It's about delivering meaningful feedback through graphics, haptics, sounds, and more. Said differently, it's about designing for all senses, not just for the eyes. We take this very seriously when designing applications and always take all of these potential dimensions into consideration.
One example is our in-app notification center. The notification center is a visual entry point, accessible from anywhere in the app, that shows all your notifications from the entire amo suite of apps. It can be moved anywhere on the screen. Its style also changes regularly thanks to in-house or external artists. But styling doesn't stop at the visual level; we also style it at the audio level: when it's dragged around, a short, repeating sound is played.
To make it fun and joyful, we pushed this even further to let the user be a DJ. The volume, the speed, and the pitch of the audio change depending on where the user drags it. It's a "be your own DJ" moment. The implementation of this experience can be split in two parts. The first part deals with the audio and the second part handles the rendering of the entry point and its interactions (dragging, tapping, etc.).
Let's first dive into the code handling the audio. It consists of a Composable requiring a URL pointing to the music, a flag indicating whether it should play or not (true only when dragging), and a two-dimensional offset: the X axis controls the volume, the Y axis controls the playback speed and pitch.
@Composable
fun GalaxyGateAccessPointMusicPlayer(
    musicUrl: String,
    isActive: Boolean,
    offset: Offset,
) {
    val audioPlayer = rememberAudioPlayer(
        uri = Uri.parse(musicUrl),
    )
    LaunchedEffect(audioPlayer, isActive) {
        if (isActive) {
            audioPlayer.play(isLooped = true)
        } else {
            audioPlayer.pause()
        }
    }
    SideEffect {
        audioPlayer.setSpeedPitch(
            speed = 0.75f + offset.y * 0.5f,
            pitch = offset.x + 0.5f,
        )
        audioPlayer.setVolume(
            (1.0f - ((offset.x - 0.5f) * 2f).coerceIn(0f, 1f)),
            (1.0f - ((0.5f - offset.x) * 2f).coerceIn(0f, 1f)),
        )
    }
}
@Composable
fun rememberAudioPlayer(
    uri: Uri,
): AudioPlayer {
    val context = LocalContext.current
    val lifecycle = LocalLifecycleOwner.current.lifecycle
    return remember(context, lifecycle, uri) {
        DefaultAudioPlayer(
            context = context,
            lifecycle = lifecycle,
            uri = uri,
        )
    }
}
DefaultAudioPlayer is simply an in-house wrapper around the ExoPlayer provided by Jetpack Media3 that deals with initialization, lifecycle management, fading when starting/stopping music, etc. It exposes two methods, setSpeedPitch and setVolume, delegating to the underlying ExoPlayer.
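A heavily simplified sketch of such a wrapper, built only on public Media3 APIs, might look like the class below. The real DefaultAudioPlayer additionally handles fades, and its two-channel setVolume presumably performs proper stereo panning rather than the single master volume used here:

import android.content.Context
import android.net.Uri
import androidx.lifecycle.DefaultLifecycleObserver
import androidx.lifecycle.Lifecycle
import androidx.lifecycle.LifecycleOwner
import androidx.media3.common.MediaItem
import androidx.media3.common.PlaybackParameters
import androidx.media3.common.Player
import androidx.media3.exoplayer.ExoPlayer

// Hypothetical, simplified stand-in for DefaultAudioPlayer.
class SketchAudioPlayer(
    context: Context,
    lifecycle: Lifecycle,
    uri: Uri,
) {
    private val player = ExoPlayer.Builder(context).build().apply {
        setMediaItem(MediaItem.fromUri(uri))
        prepare()
    }

    init {
        // Release the player when the host lifecycle is destroyed.
        lifecycle.addObserver(object : DefaultLifecycleObserver {
            override fun onDestroy(owner: LifecycleOwner) = player.release()
        })
    }

    fun play(isLooped: Boolean) {
        player.repeatMode =
            if (isLooped) Player.REPEAT_MODE_ONE else Player.REPEAT_MODE_OFF
        player.play()
    }

    fun pause() = player.pause()

    fun setSpeedPitch(speed: Float, pitch: Float) {
        // Delegates to ExoPlayer's playback parameters.
        player.playbackParameters = PlaybackParameters(speed, pitch)
    }

    fun setVolume(left: Float, right: Float) {
        // Approximation: ExoPlayer exposes a single master volume here.
        player.volume = maxOf(left, right)
    }
}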
By combining gesture readings with audio pitch, speed, and volume, we added delight and surprise where users didn't expect it.
We named our application "Bump" as a nod to its core feature: people close to each other can "bump" their phones together. If they aren't registered as friends on the app, bumping will automatically send a friend request. And if they are, a mesmerizing animation triggers. We also notify mutual friends that they "bumped," encouraging them to join.
This Bump feature is central to the app's experience. It stands out in its interaction, functionality, and the unique value it provides. To express its importance, we wanted the feature to have a distinctive visual appeal. Here's a glimpse of how it currently looks in the app:
There's a lot going on in this video, but it can be summarized in three animations: the "wave" animation when the device detects a local bump/shake, the animation showing the two friends bumping, and finally a "ring pulsing" animation to finish. While the second animation is plain Compose, the two others are custom. Creating such custom effects involved venturing into what is often considered "unknown territory" in Android development: custom shaders. While daunting at first, it's actually quite accessible and unlocks immense creative potential for truly unique experiences.
Simply put, shaders are highly parallelizable code segments. Each shader runs once per pixel per frame. This may sound intense, but it is precisely where GPUs excel. In Android 13, shaders became first-class citizens with AGSL shaders and RuntimeShader for Views and Compose.
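As an illustration of how little code a RuntimeShader needs on Android 13 and later, here is a hypothetical wobble effect applied through Compose's graphicsLayer. This is not the shader used in Bump, which targets older API levels via OpenGL as explained below:

import android.graphics.RenderEffect
import android.graphics.RuntimeShader
import android.os.Build
import androidx.annotation.RequiresApi
import androidx.compose.runtime.Composable
import androidx.compose.runtime.remember
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.asComposeRenderEffect
import androidx.compose.ui.graphics.graphicsLayer

// AGSL runs once per pixel: here each row is displaced horizontally with a
// sine wave that is animated through the "time" uniform.
private const val WOBBLE_AGSL = """
    uniform shader contents;
    uniform float time;

    half4 main(float2 coord) {
        float offset = 8.0 * sin(coord.y / 24.0 + time * 6.2831);
        return contents.eval(float2(coord.x + offset, coord.y));
    }
"""

@RequiresApi(Build.VERSION_CODES.TIRAMISU)
@Composable
fun Modifier.wobble(time: Float): Modifier {
    val shader = remember { RuntimeShader(WOBBLE_AGSL) }
    return graphicsLayer {
        shader.setFloatUniform("time", time)
        renderEffect = RenderEffect
            .createRuntimeShaderEffect(shader, "contents")
            .asComposeRenderEffect()
    }
}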
Since our app requires a minimum of API 30 (Android 11), we opted for a more traditional approach using a custom OpenGL renderer.
We extract a Bitmap of the view we want to apply the effect to, pass it to the OpenGL renderer, and run the shader. While this technique ensures backward compatibility, its main drawback is that it operates on a snapshot of the view hierarchy throughout the animation. Consequently, any changes occurring in the view during the animation aren't reflected on screen until the animation concludes. Keen observers might notice a slight glitch at the end of the animation when the screenshot is removed and normal rendering resumes.
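The exact capture code isn't shown here, but as a rough sketch, one way to grab such a snapshot (not necessarily the one Bump uses) is PixelCopy, available since API 26; the resulting Bitmap is then uploaded as a texture to the OpenGL renderer:

import android.app.Activity
import android.graphics.Bitmap
import android.graphics.Rect
import android.os.Handler
import android.os.Looper
import android.view.PixelCopy
import android.view.View

// Copies the pixels currently shown by `view` inside the activity's window.
fun snapshotView(activity: Activity, view: View, onReady: (Bitmap) -> Unit) {
    val location = IntArray(2).also(view::getLocationInWindow)
    val bounds = Rect(
        location[0],
        location[1],
        location[0] + view.width,
        location[1] + view.height,
    )
    val bitmap = Bitmap.createBitmap(view.width, view.height, Bitmap.Config.ARGB_8888)
    PixelCopy.request(activity.window, bounds, bitmap, { result ->
        if (result == PixelCopy.SUCCESS) onReady(bitmap)
    }, Handler(Looper.getMainLooper()))
}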
In our apps, profile pictures are a bit different. Instead of static photos, you record a live profile picture, a transparent, animated, boomerang-like cutout. This approach feels more personal, because it literally brings your friends to life on screen. You see them smiling or making faces, rather than just viewing curated or filtered photos from their camera roll.
From a product perspective, this feature involves two key phases: the recording and the rendering. Before diving into these specific areas, let's discuss the format we use for data transport between mobile devices (Android & iOS) and the server. To optimize bandwidth and decoding time, we chose the H.265 HEVC format in an MP4 container and perform the face detection on device. Most modern devices have hardware decoders, making decoding extremely fast. Since cross-platform videos with transparency aren't widely supported or optimized, we developed a custom in-house solution. Our videos consist of two "planes":
- The original video on top
- A mask video at the bottom
We haven't yet optimized this process. Currently, we don't pre-apply the mask to the top plane. Doing so could reduce the final encoded video size by replacing the original background with a plain color.
This format is fairly effective. For instance, the video above is only 64KB. Once we aligned all mobile platforms on the format for our animated profile pictures, we began implementing it.
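For illustration, here is a minimal sketch of how a decoded composite frame could be turned into a transparent cutout, assuming the bottom plane is a white-on-black luminance matte (in the real pipeline this happens as part of a larger decoding step described later):

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint
import android.graphics.PorterDuff
import android.graphics.PorterDuffXfermode
import android.graphics.Rect

// Turns a decoded composite frame (color plane on top, luminance mask below)
// into a single frame with real transparency.
fun applyBottomMask(composite: Bitmap): Bitmap {
    val width = composite.width
    val frameHeight = composite.height / 2
    val dst = Rect(0, 0, width, frameHeight)

    val result = Bitmap.createBitmap(width, frameHeight, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(result)

    // 1. Draw the color plane (top half) as-is.
    canvas.drawBitmap(composite, Rect(0, 0, width, frameHeight), dst, null)

    // 2. Convert the mask plane's luminance into alpha, then keep only the
    //    destination pixels it covers (white = keep, black = transparent).
    val maskPaint = Paint().apply {
        colorFilter = ColorMatrixColorFilter(
            ColorMatrix(
                floatArrayOf(
                    0f, 0f, 0f, 0f, 0f,
                    0f, 0f, 0f, 0f, 0f,
                    0f, 0f, 0f, 0f, 0f,
                    0.299f, 0.587f, 0.114f, 0f, 0f,
                )
            )
        )
        xfermode = PorterDuffXfermode(PorterDuff.Mode.DST_IN)
    }
    canvas.drawBitmap(composite, Rect(0, frameHeight, width, composite.height), dst, maskPaint)

    return result
}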
Recording a Live Profile Picture
The first step is capturing the video, which is handled by Jetpack CameraX. To provide users with visual feedback, we also make use of ML Kit Face Detection. Initially, we tried to map detected facial expressions (such as eyes closed or smiling) to a 3D model rendered with Filament. However, achieving real-time performance proved too challenging for the timeframe we had. We instead decided to detect the face contour and to move a default avatar image on the screen accordingly.
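As a rough sketch of that feedback loop, a CameraX ImageAnalysis analyzer can feed camera frames to ML Kit and report the detected face contour to whatever moves the avatar. The onFaceContour callback below is a hypothetical hook, not our actual code:

import android.graphics.PointF
import androidx.annotation.OptIn
import androidx.camera.core.ExperimentalGetImage
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceContour
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Analyzes each camera frame and reports the first detected face contour.
class FaceContourAnalyzer(
    private val onFaceContour: (List<PointF>) -> Unit,
) : ImageAnalysis.Analyzer {

    private val detector = FaceDetection.getClient(
        FaceDetectorOptions.Builder()
            .setContourMode(FaceDetectorOptions.CONTOUR_MODE_ALL)
            .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST)
            .build()
    )

    @OptIn(ExperimentalGetImage::class)
    override fun analyze(imageProxy: ImageProxy) {
        val mediaImage = imageProxy.image ?: return imageProxy.close()
        val input = InputImage.fromMediaImage(mediaImage, imageProxy.imageInfo.rotationDegrees)
        detector.process(input)
            .addOnSuccessListener { faces ->
                faces.firstOrNull()
                    ?.getContour(FaceContour.FACE)
                    ?.let { onFaceContour(it.points) }
            }
            .addOnCompleteListener { imageProxy.close() } // always release the frame
    }
}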
Once the recording is complete, Jetpack CameraX provides a video file containing the recorded sequence. This marks the beginning of the second step. The video is decoded frame by frame, and each frame is processed using ML Kit Selfie Segmentation. This API computes a segmentation mask from the input image (our frames) and produces an output mask of the same size. Next, a composite image is generated, with the original video frame on top and the mask frame at the bottom. These composite frames are then fed into an H.265 video encoder. Once all frames are processed, the video meets the specifications described earlier and is ready to be sent to our servers.
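Here is a minimal sketch of the per-frame segmentation call, assuming frames have already been decoded to Bitmaps; compositing and re-encoding are omitted:

import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.segmentation.Segmentation
import com.google.mlkit.vision.segmentation.SegmentationMask
import com.google.mlkit.vision.segmentation.selfie.SelfieSegmenterOptions
import kotlinx.coroutines.tasks.await

// One segmenter instance reused across all frames of the recording.
private val segmenter = Segmentation.getClient(
    SelfieSegmenterOptions.Builder()
        .setDetectorMode(SelfieSegmenterOptions.SINGLE_IMAGE_MODE)
        .enableRawSizeMask() // mask at model resolution; scale it to the frame yourself
        .build()
)

suspend fun segmentFrame(frame: Bitmap): SegmentationMask {
    val input = InputImage.fromBitmap(frame, /* rotationDegrees = */ 0)
    // Each mask pixel is a Float in [0, 1]: the confidence it belongs to the subject.
    return segmenter.process(input).await()
}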
While the process could be improved with better interframe selfie segmentation, usage of depth sensors, or more advanced AI techniques, it performs well and has been running successfully in production for over a year.
Rendering Your Friends on the Map
Playing back animated profile pictures presented another challenge. The main difficulty arose from what seemed like a simple product requirement: displaying 10+ real-time moving profile pictures simultaneously on the screen, animating in a back-and-forth loop (similar to boomerang videos). Video decoders, especially hardware ones, excel at decoding videos forward. However, they struggle with reverse playback. Additionally, decoding is computationally intensive. While decoding a single video is manageable, decoding 10+ videos in parallel isn't. Our requirement was akin to wanting to watch 10+ movies simultaneously in your favorite streaming app, all in reverse. That is an uncommon and unique use case.
We overcame this challenge by trading computational needs for increased memory consumption. Instead of repeatedly decoding video, we opted to store all frames of the animation in memory. The video is a 30fps, 2.5-second video with a resolution of 256×320 pixels and transparency: 75 frames of 256 × 320 ARGB pixels at 4 bytes each, or roughly 24MB per video. A queue-based system handling decoding requests sequentially can manage this efficiently. For each request, we:
- Decode the video frame by frame using the Jetpack Media3 Transformer APIs
- For each frame:
  - Apply the lower part of the video as a mask to the upper part (as sketched earlier).
  - Append the generated Bitmap to the list of frames.
Upon completing this process, we obtain a List<Bitmap> containing all the ordered, transformed (mask-applied) frames of the video. To animate the profile picture in a boomerang fashion, we simply run a linear, infinite transition. This transition starts from the first frame, proceeds to the last frame, and then returns to the first frame, repeating this cycle indefinitely.
@Immutable
class MovingCutout(
    val duration: Int,
    val bitmaps: List<ImageBitmap>,
) : Cutout

@Composable
fun rememberMovingCutoutPainter(cutout: MovingCutout): Painter {
    val state = rememberUpdatedState(newValue = cutout)
    val infiniteTransition = rememberInfiniteTransition(label = "MovingCutoutTransition")
    val currentBitmap by infiniteTransition.animateValue(
        initialValue = cutout.bitmaps.first(),
        targetValue = cutout.bitmaps.last(),
        typeConverter = state.VectorConverter,
        animationSpec = infiniteRepeatable(
            animation = tween(cutout.duration, easing = LinearEasing),
            repeatMode = RepeatMode.Reverse
        ),
        label = "MovingCutoutFrame"
    )
    return remember(cutout) {
        // A custom BitmapPainter implementation to allow delegation when getting
        // 1. Intrinsic size
        // 2. Current Bitmap
        CallbackBitmapPainter(
            getIntrinsicSize = {
                with(cutout.bitmaps[0]) { Size(width.toFloat(), height.toFloat()) }
            },
            getImageBitmap = { currentBitmap }
        )
    }
}

private val State<MovingCutout>.VectorConverter: TwoWayConverter<ImageBitmap, AnimationVector1D>
    get() = TwoWayConverter(
        convertToVector = { AnimationVector1D(value.bitmaps.indexOf(it).toFloat()) },
        convertFromVector = { value.bitmaps[it.value.roundToInt()] }
    )
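Using the painter is then as simple as passing it to a regular Image; a hypothetical call site could look like this:

import androidx.compose.foundation.Image
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier

@Composable
fun FriendCutout(cutout: MovingCutout, modifier: Modifier = Modifier) {
    // The painter swaps the displayed frame on every animation tick, so a
    // plain Image is enough to render the boomerang loop.
    Image(
        painter = rememberMovingCutoutPainter(cutout),
        contentDescription = null,
        modifier = modifier,
    )
}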
As a map-based social app, Bump relies heavily on the Google Maps SDK for Android. While the framework provides default interactions, we wanted to push the boundaries of what's possible. In particular, users want to zoom in and out quickly. Although Google Maps offers pinch-to-zoom and double-tap gestures, these have limitations. Pinch-to-zoom requires two fingers, and double-tap doesn't cover the full zoom range.
For a better user experience, we've added our own gestures. One particularly useful feature is edge zoom, which allows rapid zooming in and out using a single finger. Simply swipe up or down from the left or right edge of the screen. Swiping down to the bottom zooms out completely, while swiping up to the top zooms in fully.
Like Google Maps gestures, there are no visual cues for this feature, but that's acceptable for a power gesture. We provide visual and haptic feedback to help users remember it. Currently, this is achieved with a glue-like effect that follows the finger, as shown below:
Implementing this feature involves two tasks: detecting edge zoom gestures and rendering the visual effect. Thanks to Jetpack Compose's versatility, this can be accomplished in just a few lines of code. We use the draggable2D modifier to detect drags, which triggers an onDragUpdate callback to update the Google Maps camera and triggers a recomposition by updating a point variable.
@Composable
fun EdgeZoomGestureDetector(
    side: EdgeZoomSide,
    onDragStarted: () -> Unit,
    onDragUpdate: (Float) -> Unit,
    onDragStopped: () -> Unit,
    modifier: Modifier = Modifier,
    curveSize: Dp = 160.dp,
) {
    var heightPx by remember { mutableIntStateOf(Int.MAX_VALUE) }
    var point by remember { mutableStateOf(Offset.Zero) }
    val draggableState = rememberDraggable2DState { delta ->
        point = when (side) {
            EdgeZoomSide.Start -> point + delta
            EdgeZoomSide.End -> point + Offset(-delta.x, delta.y)
        }
        onDragUpdate(delta.y / heightPx)
    }
    val curveSizePx = with(LocalDensity.current) { curveSize.toPx() }

    Box(
        modifier = modifier
            .fillMaxHeight()
            .onPlaced {
                heightPx = it.size.height
            }
            .draggable2D(
                state = draggableState,
                onDragStarted = {
                    point = it
                    onDragStarted()
                },
                onDragStopped = {
                    point = point.copy(x = 0f)
                    onDragStopped()
                },
            )
            .drawWithCache {
                val path = Path()
                onDrawBehind {
                    path.apply {
                        reset()
                        val x = point.x.coerceAtMost(curveSizePx / 2f)
                        val y = point.y
                        val top = y - (curveSizePx - x)
                        val bottom = y + (curveSizePx - x)
                        moveTo(0f, top)
                        cubicTo(
                            0f, top + (y - top) / 2f,
                            x, top + (y - top) / 2f,
                            x, y
                        )
                        cubicTo(
                            x, y + (bottom - y) / 2f,
                            0f, y + (bottom - y) / 2f,
                            0f, bottom,
                        )
                    }
                    scale(side.toXScale(), 1f) {
                        drawPath(path, Palette.black)
                    }
                }
            }
    )
}

enum class EdgeZoomSide(val alignment: Alignment) {
    Start(Alignment.CenterStart),
    End(Alignment.CenterEnd),
}

private fun EdgeZoomSide.toXScale(): Float = when (this) {
    EdgeZoomSide.Start -> 1f
    EdgeZoomSide.End -> -1f
}
The drawing part is handled by the drawBehind modifier, which creates a Path consisting of two smooth cubic curves, emulating a Gaussian curve. Before rendering it, the path is flipped on the X axis based on the screen side.
This effect looks good, but it also feels static, instantly following the finger without any animation. To improve this, we added a spring-based animation. By extracting the computation of x (representing the tip of the Gaussian curve) from drawBehind into an animatable state, we achieve a smoother visual effect:
val x by animateFloatAsState(
    targetValue = point.x.coerceAtMost(curveSizePx / 2f),
    label = "animated-curve-width",
)
This creates a visually appealing effect that feels natural. However, we wanted to engage other senses too, so we introduced haptic feedback to mimic the feel of a toothed wheel on an old safe. Using Kotlin Flow, LaunchedEffect, and snapshotFlow, this was implemented in just a few lines of code:
val haptic = LocalHapticFeedback.current
LaunchedEffect(heightPx, slotCount) {
    val slotHeight = heightPx / slotCount
    snapshotFlow { (point.y / slotHeight).toInt() }
        .drop(1) // Drop the initial "tick"
        .collect {
            haptic.performHapticFeedback(HapticFeedbackType.SegmentTick)
        }
}
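Putting the pieces together, here is a hypothetical wiring of the detector to a Maps Compose camera. The zoom mapping (the 15f scale factor and the 24.dp touch strip) is illustrative, not Bump's actual tuning:

import androidx.compose.foundation.layout.Box
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.foundation.layout.width
import androidx.compose.runtime.Composable
import androidx.compose.ui.Alignment
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import com.google.android.gms.maps.CameraUpdateFactory
import com.google.maps.android.compose.GoogleMap
import com.google.maps.android.compose.rememberCameraPositionState

@Composable
fun MapWithEdgeZoom() {
    val cameraPositionState = rememberCameraPositionState()
    Box(modifier = Modifier.fillMaxSize()) {
        GoogleMap(
            modifier = Modifier.fillMaxSize(),
            cameraPositionState = cameraPositionState,
        )
        EdgeZoomGestureDetector(
            side = EdgeZoomSide.End,
            onDragStarted = {},
            onDragUpdate = { fraction ->
                // Dragging down (positive fraction) zooms out, dragging up zooms in.
                cameraPositionState.move(CameraUpdateFactory.zoomBy(-fraction * 15f))
            },
            onDragStopped = {},
            modifier = Modifier
                .align(Alignment.CenterEnd)
                .width(24.dp),
        )
    }
}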
Bump is full of many other innovative features. We invite you to explore the product further to discover more of these gems. Overall, the entire Android ecosystem, including the platform, developer tools, Jetpack Compose, and Google Play Services, provided most of the necessary building blocks. It offered the flexibility needed to design and implement these unique interactions. Thanks to Android, creating a standout product is just a matter of passion, time, and a few lines of code!