A Look at Redis Memory Fragmentation Cleanup


After a large number of keys are deleted from Redis, the memory Redis originally requested from the OS (used_memory_rss) stays held by the process rather than being released, and at that point the memory info will show heavy fragmentation. So can Redis memory fragmentation be cleaned up, and if so, how?
Digging through the Redis documentation shows that active defragmentation is supported starting with Redis 4, so I ran a test, as follows:
1. Set up Redis
Build a Redis instance, version 4.0.14. For the setup steps, refer to earlier posts on this blog or the WeChat official account; they are straightforward, with no surprises, and the instance comes up quickly.
2. Insert a large number of keys to drive memory usage up
Write a simple loop to insert keys in bulk, as in the sketch below.
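For illustration, here is a minimal redis-py sketch of such a loop; the host, the testkey:N naming, the 1 KB value, and the one-million count are assumptions made for this test, not requirements.

import redis

r = redis.Redis(host='127.0.0.1', port=6379)

value = b'x' * 1024                    # roughly 1 KB of payload per key (assumed size)
pipe = r.pipeline(transaction=False)   # batch commands to cut network round trips
for i in range(1_000_000):
    pipe.set(f'testkey:{i}', value)
    if i % 10_000 == 0:
        pipe.execute()                 # flush a batch of SETs
pipe.execute()                         # flush the final partial batch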
3. Delete more than 90% of the keys
Delete keys in a loop, or set an expiration when the keys are created; once the keys have been deleted or have expired, check the memory state (a deletion sketch appears after the output below).
127.0.0.1:6379> info memory
# Memory
used_memory:137040696
used_memory_human:130.69M
used_memory_rss:11705876480
used_memory_rss_human:10.90G
used_memory_peak:12091169848
used_memory_peak_human:11.26G
used_memory_peak_perc:1.13%
used_memory_overhead:3473184
used_memory_startup:786648
used_memory_dataset:133567512
used_memory_dataset_perc:98.03%
total_system_memory:16862617600
total_system_memory_human:15.70G
used_memory_lua:37888
used_memory_lua_human:37.00K
maxmemory:12000000000
maxmemory_human:11.18G
maxmemory_policy:noeviction
mem_fragmentation_ratio:85.42
mem_allocator:jemalloc-4.0.3
active_defrag_running:0
lazyfree_pending_objects:0
As you can see, the memory actually in use is 130.69M while Redis still holds 10.90G of RSS; mem_fragmentation_ratio, i.e. used_memory_rss / used_memory (11705876480 / 137040696), is 85.42, which is extremely high.
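As a concrete companion to step 3, here is a sketch under the same assumptions as the insertion sketch above: it drops roughly 90% of the testkey:N keys, keeping every tenth one.

import redis

r = redis.Redis(host='127.0.0.1', port=6379)

pipe = r.pipeline(transaction=False)
for i in range(1_000_000):
    if i % 10 != 0:                    # keep every 10th key, delete the rest
        pipe.delete(f'testkey:{i}')
    if i % 10_000 == 0:
        pipe.execute()                 # flush a batch of DELs
pipe.execute()

Alternatively, set a TTL at creation time, e.g. r.set(key, value, ex=600), and wait for the keys to expire.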
4. Clean up the memory fragmentation
Automatic defragmentation is off by default; you can check it with the following command.
127.0.0.1:6379> config get activedefrag
1) "activedefrag"
2) "no"
Turn on automatic defragmentation.
127.0.0.1:6379> config set activedefrag yes
OK
After enabling it, check the memory info again.
127.0.0.1:6379> info memory
# Memory
used_memory:138029408
used_memory_human:131.64M
used_memory_rss:5052907520
used_memory_rss_human:4.71G
used_memory_peak:12091169848
used_memory_peak_human:11.26G
used_memory_peak_perc:1.14%
used_memory_overhead:3752728
used_memory_startup:786648
used_memory_dataset:134276680
used_memory_dataset_perc:97.84%
total_system_memory:16862617600
total_system_memory_human:15.70G
used_memory_lua:37888
used_memory_lua_human:37.00K
maxmemory:12000000000
maxmemory_human:11.18G
maxmemory_policy:noeviction
mem_fragmentation_ratio:36.61
mem_allocator:jemalloc-4.0.3
active_defrag_running:0
lazyfree_pending_objects:0
At this point the memory Redis holds (used_memory_rss) has dropped to 4.71G, and the fragmentation ratio is down to 36.61. It is still high because active defragmentation works incrementally in the background, so the ratio keeps falling over time.
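The same CONFIG GET/SET exchange can be scripted. Here is a sketch with redis-py, assuming a local instance; note that activedefrag is only available when Redis is built with jemalloc, its default allocator on Linux.

import redis

r = redis.Redis(host='127.0.0.1', port=6379, decode_responses=True)

print(r.config_get('activedefrag'))    # e.g. {'activedefrag': 'no'}
r.config_set('activedefrag', 'yes')    # turn on background defragmentation

mem = r.info('memory')
print(mem['mem_fragmentation_ratio'], mem['active_defrag_running'])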
5. Inspect the memory allocation
You can also inspect how memory is allocated; the key metric is the util column in the bins table. Here the highest value has reached 0.998 (excluding bins already at 1).
127.0.0.1:6379> memory malloc-stats
___ Begin jemalloc statistics ___
Version: 4.0.3-0-ge9192eacf8935e29fc62fddc2701f7942b1cc02c
Assertions disabled
Run-time option settings:
  opt.abort: false
  opt.lg_chunk: 21
  opt.dss: "secondary"
  opt.narenas: 8
  opt.lg_dirty_mult: 3 (arenas.lg_dirty_mult: 3)
  opt.stats_print: false
  opt.junk: "false"
  opt.quarantine: 0
  opt.redzone: false
  opt.zero: false
  opt.tcache: true
  opt.lg_tcache_max: 15
CPUs: 2
Arenas: 8
Pointer size: 8
Quantum size: 8
Page size: 4096
Min active:dirty page ratio per arena: 8:1
Maximum thread-cached size class: 32768
Chunk size: 2097152 (2^21)
Allocated: 138983464, active: 149237760, metadata: 133846144, resident: 299532288, mapped: 5274861568
Current active ceiling: 153092096
arenas[0]:
  assigned threads: 1
  dss allocation precedence: secondary
  min active:dirty page ratio: 8:1
  dirty pages: 36435:4043 active:dirty, 348100 sweeps, 737670 madvises, 4686933 purged
  [per-size-class bins/large/huge tables elided; the column to watch is util, whose highest value here is 0.998]
--- End jemalloc statistics ---
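To watch the cleanup converge, here is a small polling sketch; the five-second interval is arbitrary, and MEMORY MALLOC-STATS simply returns the raw jemalloc report shown above as a string.

import time
import redis

r = redis.Redis(host='127.0.0.1', port=6379, decode_responses=True)

print(r.execute_command('MEMORY', 'MALLOC-STATS')[:200])   # head of the allocator report

while True:
    mem = r.info('memory')
    print(f"ratio={mem['mem_fragmentation_ratio']} "
          f"defrag_running={mem['active_defrag_running']}")
    if not mem['active_defrag_running']:
        break                          # no defrag cycle currently in progress
    time.sleep(5)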

