To understand this property we probably need to learn about “function memoization” first. Function memoization is the ability of a function to cache its result based on its inputs, so the result does not need to be computed again every time the function is called with the same inputs. That is only possible for pure (deterministic) functions, since we have the certainty that they will always return the same result for the same inputs, hence we can save the result and reuse it.
Function memoization is a technique widely known in the world of Functional Programming, where programs are defined as a composition of pure functions and therefore memoizing the result of those functions can imply a big leap in performance.
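To make the idea concrete, here is a minimal sketch of function memoization in plain Kotlin. The names memoize and slowSquare are illustrative, not from any library: the cache simply maps inputs to previously computed results, which is only safe because the wrapped function is pure.

```kotlin
// A minimal sketch of function memoization for a pure, single-argument
// function. `memoize` and `slowSquare` are illustrative names.
fun <T, R> memoize(f: (T) -> R): (T) -> R {
    val cache = mutableMapOf<T, R>()
    // Look the input up in the cache; compute and store it only on a miss.
    return { input -> cache.getOrPut(input) { f(input) } }
}

var computations = 0

fun slowSquare(n: Int): Int {
    computations++ // count how many times the real work actually runs
    return n * n
}

val memoizedSquare = memoize(::slowSquare)

fun main() {
    println(memoizedSquare(4)) // computed: prints 16
    println(memoizedSquare(4)) // cached: prints 16, slowSquare is not called again
    println(computations)      // prints 1
}
```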
Positional memoization is based on this idea but with a key difference. Composable functions have constant knowledge about their location on the Composable tree. The runtime will differentiate calls to the same Composable function by providing them an identity that is unique within the parent. This identity is generated based on the position of the Composable function call, among other things. That way the runtime can differentiate the three calls to the Text() Composable function here:
@Composable
fun MyComposable() {
    Text("Hello!")
    Text("Hello!")
    Text("Hello!")
}
Those three are calls to the same Text Composable, and they have the same inputs (none, in this case). But they are done from different places within the parent, hence the Composition will get three different instances of it, each one with a different identity. This identity is preserved over recompositions, so the runtime can consult the Composition to check whether a Composable was called previously or not, or whether it has changed.
Sometimes generating that identity can be hard for the runtime, since it relies on the call position in the sources. There are cases where that position will be the same for multiple calls to the same Composable, and still represent different nodes. One example is lists of Composables generated from a loop:
@Composable
fun TalksScreen(talks: List<Talk>) {
    Column {
        for (talk in talks) {
            Talk(talk)
        }
    }
}
Here, Talk(talk) is called from the same position every time, but each talk is expected to be different. In cases like this, the Compose runtime relies on the order of calls to generate the unique id and still be able to differentiate them. This works nicely when adding a new element to the end of the list, since the rest of the calls stay in the same position they were before. But what if we add elements to the top, or the middle? The runtime will recompose all the Talks below that point since they changed their position, even if their inputs have not changed. This is inefficient and could lead to unexpected issues.
In these cases, the runtime provides the key Composable so we can assign an explicit key to the call manually.
@Composable
fun TalksScreen(talks: List<Talk>) {
    Column {
        for (talk in talks) {
            key(talk.id) { // Unique key
                Talk(talk)
            }
        }
    }
}
This way we can associate each call to Talk() with a talk id, which will likely be unique, and this will allow the Composition to preserve the identity of all the items in the list even if their positions change.
Given that Composable functions know about their location, any value they cache will be scoped to the context delimited by that location. Here is an example for more clarity:
@Composable
fun FilteredImage(path: String) {
    // path is passed as a key, so the filters are recomputed when it changes
    val filters = remember(path) { computeFilters(path) }
    ImageWithFiltersApplied(filters)
}

@Composable
fun ImageWithFiltersApplied(filters: List<Filter>) {
    TODO()
}
Here we use remember to cache the result of a heavy operation to precompute the filters of an image given its path. Once we compute them, we can render the image with the already computed filters. Caching the result of the precomputation is desirable. The key for indexing the cached value will be based on the call position in the sources, and also on the keys passed to remember, which in this case is the file path.
The remember function is a Composable function that knows how to read from the slot table in order to get its result. When the function is called it will look for the function call in the table, and return the cached result when available. Otherwise it will compute it and store the result before returning it, so it can be retrieved later.
In Jetpack Compose, memoization is not the traditional “application-wide” memoization. Here, remember leverages positional memoization to get the cached value from the context delimited by the Composable calling it: FilteredImage. That means it goes to the table and looks for the value in the range of slots where the information for this Composable is expected to be stored. This makes it more like a singleton within that scope, since it will compute the value only during the initial composition, but for each recomposition it will retrieve the cached value. But if the same Composable was used in a different composition, or the same function call was remembered from a different Composable, the remembered value would be a different instance.
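The mechanics can be sketched in plain Kotlin. ToyComposer below is a made-up name and NOT the real slot table; it is a simplified model under the assumption that each call site can be assigned a stable position, and it caches one value per position, invalidated when the input changes:

```kotlin
// A toy sketch of positional memoization — NOT Compose's real slot table.
// One slot per call position, holding the last (input, value) pair.
class ToyComposer {
    private val slots = mutableMapOf<Int, Pair<Any?, Any?>>()

    fun <T> remember(position: Int, input: Any?, compute: () -> T): T {
        val slot = slots[position]
        if (slot != null && slot.first == input) {
            @Suppress("UNCHECKED_CAST")
            return slot.second as T // cached: same position, same input
        }
        val value = compute() // initial composition, or the input changed
        slots[position] = input to value
        return value
    }
}

fun main() {
    val composer = ToyComposer()
    var computations = 0
    val compute = { computations++; "filters" }

    composer.remember(0, "img.png", compute)
    composer.remember(0, "img.png", compute) // same position and input: cached
    composer.remember(1, "img.png", compute) // same input, new position: new slot

    println(computations) // prints 2: one computation per distinct position
}
```

Note how the same input computed at two different positions produces two independent slots, which is the key difference from the application-wide memoization described earlier.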
Compose is built on top of the concept of positional memoization, since that is what smart recomposition is based on. The remember Composable function simply makes use of it explicitly for more granular control.
This post by Leland Richardson from the Jetpack Compose team at Google explains positional memoization really well and brings in some visual graphics that might come in very handy.
👨🏫 Fully fledged course - “Jetpack Compose and internals”
You might want to consider attending the next edition of the highly exclusive “Jetpack Compose and internals” course I’m giving in October. I have carefully crafted it so attendees can master the library from the Android development perspective, while learning about its internals in order to build a correct and accurate mental model. I wrote its content and I will be your teacher. This course will allow you to position yourself well in the Android development industry. Limited slots left.
Stay tuned for more interesting posts on Jetpack Compose, and consider subscribing to support this newsletter.
Jorge.