While looking into Redis caching recently, I found that although Spring lets you configure a cache expiration time, the setting is global, which is not flexible enough. So I decided to build a custom annotation that sets the cache TTL per method. The language used is Kotlin.
The custom annotation:
import java.lang.annotation.Inherited
import org.springframework.cache.annotation.Cacheable

@Target(AnnotationTarget.FUNCTION) // method-level
@Retention(AnnotationRetention.RUNTIME)
@Inherited // extensibility
@Cacheable // meta-annotation
annotation class RedisCacheable(
    val cacheNames: Array<String> = [], // feeds Cacheable's Builder caches
    //val key: String = "",
    val expireTime: Long = 0 // expiration time, in minutes
)
If you want to extend Spring's @Cacheable and set cacheNames, your custom annotation must declare a cacheNames attribute; only then can Spring Boot automatically assign your cache names to caches when it builds the cache:
2018-11-332 12:43:32.935 [restartedMain] DEBUG o.s.c.a.AnnotationCacheOperationSource - Adding cacheable method 'findByName' with attribute: [Builder[public final spring.data.redis.entity.Person com.sun.proxy.$Proxy105.findByName(java.lang.String)] caches=[nameForPerson] | key='' | keyGenerator='' | cacheManager='' | cacheResolver='' | condition='' | unless='' | sync='false']
Defining a custom value attribute with @AliasFor("cacheNames") turns out not to work, so save yourself that pitfall. Other attributes (e.g. key) can be declared directly and are assigned correctly. On this topic, it is worth reading up on AnnotatedElementUtils.findAllMergedAnnotations().
Applying the annotation:
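As a side note on annotation lookup: a RUNTIME-retained annotation like the one above can be read back with plain reflection, which is roughly what the processor later does via AnnotationUtils.findAnnotation for the simple, non-aliased case. A minimal pure-Kotlin sketch (DemoCacheable and DemoRepo are stand-in names, not from this post):

```kotlin
// Stand-in for the custom annotation, without Spring on the classpath.
@Target(AnnotationTarget.FUNCTION)
@Retention(AnnotationRetention.RUNTIME)
annotation class DemoCacheable(val cacheNames: Array<String> = [], val expireTime: Long = 0)

class DemoRepo {
    @DemoCacheable(cacheNames = ["nameForPerson"], expireTime = 1)
    fun findByName(name: String): String = name
}

fun main() {
    // Look the annotation up reflectively on the method.
    val method = DemoRepo::class.java.getMethod("findByName", String::class.java)
    val ann = method.getAnnotation(DemoCacheable::class.java)
    println(ann.cacheNames.toList()) // [nameForPerson]
    println(ann.expireTime)          // 1
}
```

Directly-declared attributes such as cacheNames and expireTime come back as written; what plain reflection (and, per the pitfall above, a custom annotation) cannot do is resolve @AliasFor between value and cacheNames.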
interface PersonRedisCacheRepository : BaseRedisCacheRepository<Person, String> {
    @RedisCacheable(cacheNames = ["nameForPerson"], expireTime = 1)
    fun findByName(name: String): Person?
}
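BaseRedisCacheRepository itself is not shown in this post; assuming it is just a shared Spring Data base interface, a minimal sketch (my assumption, not the author's actual code) could be:

```kotlin
import org.springframework.data.repository.CrudRepository
import org.springframework.data.repository.NoRepositoryBean

// Hypothetical base interface: @NoRepositoryBean keeps Spring Data from
// trying to create a repository bean for the base type itself.
@NoRepositoryBean
interface BaseRedisCacheRepository<T, ID> : CrudRepository<T, ID>
```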
One thing to note here: the custom @RedisCacheable above does not specify a key, in which case Spring's @Cacheable falls back to the method's parameter values, which are combined with cacheNames to form the Redis key. With the example above, duplicate keys are therefore quite likely.
Possible solutions: 1. have a parameter reflect the entity's uniqueness (e.g. its ID); 2. use an automatic key-generation strategy; 3. use a Redis list if that suits your needs.
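Option 2 above (an automatic key-generation strategy) can be sketched in plain Kotlin. The buildCacheKey function below is illustrative, not from the post; in Spring you would wrap the same logic in an org.springframework.cache.interceptor.KeyGenerator bean and reference it via the keyGenerator attribute:

```kotlin
// Derive the key from the cache name, the method name, and every argument,
// so two methods sharing a cache name and an argument value no longer collide.
fun buildCacheKey(cacheName: String, methodName: String, vararg params: Any?): String =
    "$cacheName::$methodName:" + params.joinToString(",") { it?.toString() ?: "null" }

fun main() {
    println(buildCacheKey("nameForPerson", "findByName", "asd"))  // nameForPerson::findByName:asd
    println(buildCacheKey("nameForPerson", "findByAlias", "asd")) // nameForPerson::findByAlias:asd
}
```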
The entity class used for testing:
package spring.data.redis.entity

import org.springframework.data.annotation.Id
import java.io.Serializable
import javax.persistence.Entity
import javax.persistence.EnumType
import javax.persistence.Enumerated
import javax.persistence.Table

@Table(name = "kotlin_person")
@Entity
data class Person(
    @get:Id
    @javax.persistence.Id
    open var id: String = "",
    open var name: String = "",
    open var age: Int = 0,
    @get:Enumerated(EnumType.STRING)
    open var sex: Gender = Gender.MALE
) : Serializable {
    override fun toString(): String {
        return "Person(id='$id', name='$name', age=$age, sex=$sex)"
    }
}
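The Gender enum referenced by Person is never shown in the post; a minimal definition that matches its usage (the FEMALE constant is my assumption) would be:

```kotlin
// Assumed definition of the Gender enum used by Person; persisted as a string
// via @Enumerated(EnumType.STRING), so the constant names become column values.
enum class Gender { MALE, FEMALE }

fun main() {
    println(Gender.MALE.name) // MALE
}
```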
The annotation processor:
package spring.data.redis.annotation

import org.slf4j.LoggerFactory
import org.springframework.beans.BeansException
import org.springframework.beans.factory.InitializingBean
import org.springframework.cache.annotation.AnnotationCacheOperationSource
import org.springframework.cache.annotation.CacheConfig
import org.springframework.context.ApplicationContext
import org.springframework.context.ApplicationContextAware
import org.springframework.core.annotation.AnnotationUtils
import org.springframework.data.redis.cache.RedisCacheConfiguration
import org.springframework.data.redis.cache.RedisCacheManager
import org.springframework.data.redis.cache.RedisCacheWriter
import org.springframework.data.redis.serializer.GenericJackson2JsonRedisSerializer
import org.springframework.data.redis.serializer.RedisSerializationContext
import org.springframework.data.redis.serializer.StringRedisSerializer
import org.springframework.util.ReflectionUtils
import spring.data.redis.extension.isNotNull
import spring.data.redis.extension.isNull
import java.lang.reflect.Method
import java.time.Duration

class AnnotationProcessor(cacheWriter: RedisCacheWriter, defaultCacheConfiguration: RedisCacheConfiguration) :
    RedisCacheManager(cacheWriter, defaultCacheConfiguration), ApplicationContextAware, InitializingBean {

    private val log = LoggerFactory.getLogger(AnnotationProcessor::class.java)
    private var applicationContext: ApplicationContext? = null
    private var initialCacheConfiguration = LinkedHashMap<String, RedisCacheConfiguration>()

    @Throws(BeansException::class)
    override fun setApplicationContext(applicationContext: ApplicationContext) {
        this.applicationContext = applicationContext
    }

    override fun afterPropertiesSet() {
        parseCacheDuration(this.applicationContext)
        super.afterPropertiesSet()
    }

    // Create the caches from the configurations collected below
    override fun loadCaches() = initialCacheConfiguration.map { super.createRedisCache(it.key, it.value) }.toMutableList()

    private fun parseCacheDuration(applicationContext: ApplicationContext?) {
        if (applicationContext.isNull()) return
        val beanNames = applicationContext!!.getBeanNamesForType(Any::class.java)
        beanNames.forEach {
            val clazz = applicationContext!!.getType(it)
            addRedisCacheExpire(clazz)
        }
    }

    private fun addRedisCacheExpire(clazz: Class<*>) {
        ReflectionUtils.doWithMethods(clazz, { method ->
            ReflectionUtils.makeAccessible(method)
            val redisCache = findRedisCache(method)
            if (redisCache.isNotNull()) {
                val cacheConfig = AnnotationUtils.findAnnotation(clazz, CacheConfig::class.java)
                val cacheNames = findCacheNames(cacheConfig, redisCache)
                cacheNames.forEach {
                    val config = RedisCacheConfiguration.defaultCacheConfig()
                        .entryTtl(Duration.ofMinutes(redisCache!!.expireTime)) // TTL taken from the annotation, in minutes
                        .disableCachingNullValues() // do not cache null values
                        .serializeKeysWith(RedisSerializationContext.SerializationPair.fromSerializer(StringRedisSerializer()))
                        .serializeValuesWith(RedisSerializationContext.SerializationPair.fromSerializer(GenericJackson2JsonRedisSerializer())) // human-readable JSON
                    initialCacheConfiguration[it] = config
                }
            }
        }, { method ->
            AnnotationUtils.findAnnotation(method, RedisCacheable::class.java).isNotNull() // only intercept methods annotated with @RedisCacheable
        })
    }

    private fun findRedisCache(method: Method) = AnnotationUtils.findAnnotation(method, RedisCacheable::class.java)

    private fun findCacheNames(cacheConfig: CacheConfig?, cache: RedisCacheable?) =
        if (cache.isNull() || cache!!.cacheNames.isEmpty()) cacheConfig?.cacheNames ?: emptyArray()
        else cache.cacheNames
}
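The processor imports isNull() and isNotNull() from spring.data.redis.extension, which the post never shows. A minimal definition consistent with how they are used (a sketch; the author's real extensions may differ, e.g. by declaring Kotlin contracts so the compiler can smart-cast after the check):

```kotlin
// Null-check helpers on any nullable receiver, matching the call sites
// applicationContext.isNull() and redisCache.isNotNull() in the processor.
fun Any?.isNull(): Boolean = this == null
fun Any?.isNotNull(): Boolean = this != null

fun main() {
    val maybe: String? = null
    println(maybe.isNull())     // true
    println("cache".isNotNull()) // true
}
```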
If you are working in Java, you can skip this part. With Kotlin data classes, deserialization throws a `LinkedHashMap cannot be cast to Person` error, because when Jackson deserializes a Kotlin object through Any::class.java it cannot reconstruct the concrete class. We can fix this by adding the following to build.gradle:
allOpen {
    annotation("javax.persistence.Entity")
    annotation("javax.persistence.MappedSuperclass")
    annotation("javax.persistence.Embeddable")
}
Note that each annotation must be listed with its fully qualified name; the wildcard javax.persistence.* will not work here.
If instead you get a `Person cannot be cast to Person` exception, you most likely have hot reload enabled: compile("org.springframework.boot:spring-boot-devtools")
There are several ways to solve this; a fairly simple one is to configure your application class as follows:
@Bean
fun cacheManager() = AnnotationProcessor(
    RedisCacheWriter.nonLockingRedisCacheWriter(redisConnectionFactory),
    RedisCacheConfiguration.defaultCacheConfig(Thread.currentThread().contextClassLoader)
)
This works because the exception is caused by the JVM class loader and the Spring (devtools restart) class loader not being the same.
The application class:
@Bean
fun cacheManager() = AnnotationProcessor(
    RedisCacheWriter.nonLockingRedisCacheWriter(redisConnectionFactory),
    RedisCacheConfiguration.defaultCacheConfig()
)
Remember to enable Spring caching with @EnableCaching(proxyTargetClass = true).
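Putting the startup pieces together, the configuration might look like the sketch below (the class name CacheConfig and the constructor-injected redisConnectionFactory are my assumptions, not shown in the post):

```kotlin
import org.springframework.cache.annotation.EnableCaching
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.data.redis.cache.RedisCacheConfiguration
import org.springframework.data.redis.cache.RedisCacheWriter
import org.springframework.data.redis.connection.RedisConnectionFactory
import spring.data.redis.annotation.AnnotationProcessor

// Hypothetical configuration class wiring the custom cache manager;
// @EnableCaching(proxyTargetClass = true) switches Spring's cache support on.
@Configuration
@EnableCaching(proxyTargetClass = true)
class CacheConfig(private val redisConnectionFactory: RedisConnectionFactory) {

    @Bean
    fun cacheManager() = AnnotationProcessor(
        RedisCacheWriter.nonLockingRedisCacheWriter(redisConnectionFactory),
        RedisCacheConfiguration.defaultCacheConfig()
    )
}
```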
The controller:
@RequestMapping("person/cache/byName")
fun getCacheByName() = personRedisCacheRepository.findByName("asd")
Start the Spring Boot application and visit: http://localhost:8080/person/cache/byName
The log output:
2018-11-332 13:50:19.498 [lettuce-nioEventLoop-4-1] DEBUG i.l.core.protocol.RedisStateMachine - Decode AsyncCommand [type=GET, output=ValueOutput [output=null, error='null'], commandType=io.lettuce.core.protocol.Command]
2018-11-332 13:50:19.499 [lettuce-nioEventLoop-4-1] DEBUG i.l.core.protocol.RedisStateMachine - Decoded AsyncCommand [type=GET, output=ValueOutput [output=null, error='null'], commandType=io.lettuce.core.protocol.Command], empty stack: true
Hibernate: select person0_.id as id1_2_, person0_.age as age2_2_, person0_.name as name3_2_, person0_.sex as sex4_2_ from kotlin_person person0_ where person0_.name=?
2018-11-332 13:50:19.725 [http-nio-8080-exec-1] DEBUG io.lettuce.core.RedisChannelHandler - dispatching command AsyncCommand [type=SET, output=StatusOutput [output=null, error='null'], commandType=io.lettuce.core.protocol.Command]
2018-11-332 13:50:19.725 [http-nio-8080-exec-1] DEBUG i.l.core.protocol.DefaultEndpoint - [channel=0x63db3443, /127.0.0.1:53613 -> /127.0.0.1:6379, epid=0x1] write() writeAndFlush command AsyncCommand [type=SET, output=StatusOutput [output=null, error='null'], commandType=io.lettuce.core.protocol.Command]
2018-11-332 13:50:19.725 [http-nio-8080-exec-1] DEBUG i.l.core.protocol.DefaultEndpoint - [channel=0x63db3443, /127.0.0.1:53613 -> /127.0.0.1:6379, epid=0x1] write() done
The first query goes straight to the database and the returned result is cached in Redis; from the second query onward the data comes directly from Redis.
Visualization:
Note that without the allOpen plugin, the "@class" type hint is not written to Redis, so Spring's deserializer throws a conversion exception. Worth repeating here: all Kotlin classes and functions are final by default, while the relevant Spring libraries need classes to be public/open, so you see where the plugin comes in.
A final note: some readers asked why I only implemented a function-level annotation and not a class-level one. In an earlier article I used @RedisHash, which already lets you set the cache name (value) and the expiration time (timeToLive) at the class level, so there is no point duplicating that work.