From 1b5ce651be4713ae4722080fd442b54d91c5a82e Mon Sep 17 00:00:00 2001
From: Jay Bryant
Date: Thu, 10 May 2018 09:24:43 -0500
Subject: [PATCH] DATAMONGO-1974 - Full editing pass for Spring Data MongoDB.

Full editing pass of the Spring Data MongoDB reference guide. I also
adjusted index.adoc to work with the changes I made to the build project,
so that we get Epub and PDF as well as HTML.

Original pull request: #559.
---
 src/main/asciidoc/images/epub-cover.png       | Bin 0 -> 66019 bytes
 src/main/asciidoc/images/epub-cover.svg       |  12 +
 src/main/asciidoc/index.adoc                  |  59 +-
 src/main/asciidoc/new-features.adoc           |  49 +-
 src/main/asciidoc/preface.adoc                |  59 +-
 src/main/asciidoc/reference/cross-store.adoc  |  32 +-
 src/main/asciidoc/reference/introduction.adoc |   5 +-
 src/main/asciidoc/reference/jmx.adoc          |  28 +-
 src/main/asciidoc/reference/mapping.adoc      | 102 ++-
 src/main/asciidoc/reference/mongo-3.adoc      |  39 +-
 .../asciidoc/reference/mongo-auditing.adoc    |  11 +-
 .../reference/mongo-repositories.adoc         | 136 +--
 src/main/asciidoc/reference/mongodb.adoc      | 865 +++++++++---------
 .../asciidoc/reference/query-by-example.adoc  |  16 +-
 .../reactive-mongo-repositories.adoc          |  67 +-
 .../asciidoc/reference/reactive-mongodb.adoc  | 140 +--
 16 files changed, 828 insertions(+), 792 deletions(-)
 create mode 100644 src/main/asciidoc/images/epub-cover.png
 create mode 100644 src/main/asciidoc/images/epub-cover.svg

diff --git a/src/main/asciidoc/images/epub-cover.png b/src/main/asciidoc/images/epub-cover.png
new file mode 100644
index 0000000000000000000000000000000000000000..7be1d08b3e38461536357ea31bde85587cbd0c2e
GIT binary patch
literal 66019
[base85-encoded binary PNG data omitted]
zEscmsRw`bD_5DjKulT^U?hiWfG*V`Q&L1Q%@=jyvTDp8eEF$V)dl=4hy&kHMn=#7Q z76R;Ql3aoqU(?_Q4lyFk_NaF_gVtUXMI}MGEjeTDfgFOb{x4n>hIn zlG&EXa4Hw$h+wPMs})2f+G$qbCMNmU@?&3OQn9VAC=ljZNTpj}QdaBcnkF>(&2(KG zru)7Q$QCC5;j$mZ|-+sD+SyTzyb5S!M3)LIp_85yc^;Sau7$ zT7RoO^NMCMGE8r<@+{T6Bkw#%acGPuheqkB@PC!#9G{#n_%aL_XdQmFEyr2Y>#rfeqDV| zK!bZUX9G(%W6IQX*%B#A^&4dFWqCS^V>txNI_p0(qo&GpZJ&kJA3Q?(8BSxD=cva9 zC|#jJjoWOsRenKE!UO0De6)J!@FTx&Bv z*y+3EZEq3x64PGpM`fK(rlNbR|8@8fv%NllA?WLdg6+LKakSGp8%kqO*J$)o{~s;D zR|YiA*P1&I(j?*Zkcn2sY?x3R@WX$Orgy#w1%exDoM(p4_5pC+!N0AS5Ye(-qp}rD z85$Mzzt6`4TGX$#tbJipc%Q6e#e8kbuj_2F3ti#GIrzM`=NFFntI0y*hyIzz30Fz| zJKJyjr++Ka9J=QNYA>M@1Y6UN?U^irWcb6TVY?Ewbg=(#=bx?WMW3X;bTMEfrE&Ha zi;<1|IEx@lOsBK>+)6VeIRi~F=?9*GAC%tV8rz}+vyQC6vFvB(E{sal@4R&Tf_vJy z@g^c={i3p_KEc>Lj>-il=^;McK{?O1?&)8rf#5N#0$uz{ub-sF)TI6(_)w_w(wygr z*Y|esH%(zx3Z^*Fw3hhV=Iyu-aN{w7mkGLE@Z^=var??!2bn*@PT>|4{GGw2OpB{bech$p--05jRz1vqYF*U`$~8XaO?cNkl6J8gByRcpmZHd_>3v zIy^Q?4UPCaH=a5eIrEh23+MBWTHdnpVs%BMrmwbXG!OK2Ky*y$23BnprqBFO&#y&L zq8z;T#}X=TB>HA79N9;0hCA{qGKcN8q`HFSJ=|=LensqpD&XgRys`8$oj_HoCZV3l zRZd}C>-<*iVa2QPqKrcy{D?Pz)LX3jefEgu4Uz=M((~tkTn>Aap`EFUz~7m49O`oQ z;&|D)^m(#7FOm*~%G&MANCBRZ+QAwHFErWk=NglN-1SN_+##A>CGxKp)39M^Xj?SGsVwiv)d~<0#jY-<*s3ReFvw)i@ATsk3N%}L2pcQ(3cM(y2M_|hj@Q#qgGB=nmDXkRjXdTy>d!4nH=Txu2|s0h`d zL&vhv4BZQiiLG)kYbT@JDzO6JKyd@kea=H-MFWtR!C{~jDyU2 zE{uJNkotUBcn@5w`@S4PLdp1tP{FYF!5Q3dsPf`s=kghbX=7C~yLi+m4k1iXN<*!> zC1S*I8UiT#tjwCg%CT(FQyqEN7u;L-3iGp`K7sB5dH^K-z@W(28dam=7~Her3aPou z+GUKoX_5)=x7oA8>FJ?GJ=8T8t>6HJxVEo&5v(=r-wzWo>-l^)4d@MbGMf!t#Ltm^ zDDmBsR?~!v4|nO1>4}BLKt^~hvW=dOEC7N7#{CH%fWSO@ey%u}g;z(qpfIo7v>+G6 z-GGYORy*)C3t@qQ0lf>b&S_pcYR!Fl#w|c8gZQq~LVIXO^L1aD1xenF%hL z4j!1Uaw3Rnl#{VZh`e@n9_N0#!~!n5PK|3bO!aG9Mk-7ucmL3C+~LXeX6H}rFYBFq z)T!F@%a&TH?{L6*;$!bjByOOSYb5e#-iU_G80h@2KW6&=Ek*{2w^7b6)NQs9jO{ZB zn$b_0%FMD)KP7=IIK=~i`7bII?LmsTt zRc#(fz`i+Ayrt3v)dawBC4ZhhS9b^Z$}?irNHK{lkEXmdj#S+Z*E>0h%aGnfk8XpS z{trJvbMK(%l(X)c3)SF_rWDv_QC$8X=kVs0qNFsEs|UE-VUiE9;Yv+GI~5L8J;E_{ 
zkhn_P;g|@OVo!Vp^c>Kta&^0NL%Iik_}fWGu9+9u5~Qr4wT+hWY>6}nzr{PCwh5W6 z#QzTXxbdG*A?n@n2Rcisf|JF@$f3!zXL}2df20_<#21LYNnQkWt?U^x8lI1CI&JT) zs~6`Xy$I4hR)6f2AnIh_-ycdvt7CTBe9$!&jb}GVD8t3TN#bv2sK3I1ZQFsSd+giG zjz~fk@off%KP*$ygGAd8ICVDb1lut!Np77d8?rL{ovR=Beu=@&Rg>g7MTfRQ10xz_ zmRe85z{tHGv8X=#{Z8u$TAW(-r|ktg!gqLnZP8+W>|db&>phQAX;jDe(;J3}a;K); z6=vJ!|DfGY(%AL0cOV1M*kqSp8pY%&FK##JrP#lSH34Uh5(9h2YDo^w;KBc!z`&{; z-heVkvp3cKq$T_PlF?4QkT@tnF+_nj*c<-u$cA1TA;MUjLJaT^Bb^U8%^kl^3Um8& z4vuY^_0-MZ;B->}r0vQVb>x)(G{$8(5zX&N&!x`;x_S1dHzt+ZCJwaZH%#tL_O(H* zytjQ4od^_2LcblqO{0jB+@^xpc>h_+r;IxWP)OT{Ij;vA3*0xtN1#_hnYZw$vtcZ|xvj z7hid+m$4XuT$s2xHDW)eWlkYl&=H^tqgU*5ml8Kpr_AcgMF)urY@w%&z~)|J8Fg~} zx2-!m(brAzq2>-GvF186*kkFu^_*IGxUH(n??R(uSV?2SDUd2p1Cy8kDH1?|?^m@UD*kKX{F6 zI)g>V+Y!L|@912vrRpZ44gehQG8NE6VIBa5#q3(@i|fF>qPPL*;bqf9D*~_6TY`)! z0F|3e0WkDOT{|f72tNV?q#Q5cbJqUG33WTS};whkvz$eM<<1?cF zod7Kpk|V%M)UzvBaEMC+PJP5me_UqsnIAOm1gp|WNAz7@&(i(~v!v*cpDqAxe*b|6 zAUkEpy5fly9pGr2zs^SzvtJxUp#D#CTkB9)do2$<{^S5vds_XYUvj+zU!v z96sx+n6>1p*>Sw+{AQhNY{vIn&65dbt$vw0Z$O1P2`K&#O8C zoIG{)q>vTqgRra|AVq)y)ZbH}W(H`lR5Xhdzn$Y$iFWuoMQd#^km8f?T8*!@TX^}M$h03<<0{%uiY6n) z-mfO9YVMnR+787?RQllY7(*xMAAC~W4nA)0J;d;@57#W<@S*sHKjaCfU6-$1Th}w@ z{RI_`jxGsbW_ZG_*GOP{BZv6|aAQb|>B=!|tA&sxNvx4*vG$0`vL1YH77nX7lL-Fe<2q!p-M{>76l z0;+=4HK1Y*1xEa*t?9COngH}2C#83>0G^PxBc$} zr~)W>*QzM^#Dm6@l2T+tTeFp-@kiIg0on3La8JWcEyHT-Ir83?Rdl7-00YeZT3TZ@ zc{iVXbm|*kJi@N91Td5lm5S>6Va|i7_^)qpjwP27d;;2Hv9EN2|_}b&_INd5DSm!DM93(M!?W5q9c{^?zWmTD@hqib!JYFl#;tW{OS!VFG*vcLZPZ~*vJfz^<9xH45For^jXB*7#o&O|@) z0LLo7XCX~gW9BlMv&!g=^7>>E%c*rO`P5Bw&|MD5LU+aob^!Bg$>mT5jlSY{xy6Q0 zE!I8Z=_8Mbf1s~pqyB7@F0ufQ106=Ugi_tFqrLc5cHEZXCZPALe}8}7#JGOCPFN@u zXj9@m$$h(P3`*WP%v#c2E%Xm?|?}54ob5 zoMe-go&EXBnV@;gUX8!tK)`lIS*Hi2*E+dl`9qz0 zOXW6szV+|C_l7>tzTKIr2m?|3l+L0J4L#5*V6G`s=Ra4@7*{1h!C_d~svhrGUMc0U zm>6Zv*vO|bv7P1GuZrHs#F7;oXO)+iJC;O9*%TFtkUxBws;%-N_CpLg1J^^^hg=M_ z_k7B`3k&^aldX@B6oqgBm~Tkz{+Gtu3yCw_`vhmWr4{@_&iTar3Ux`Jev!+5Z|i2t zW4Oq86uIO0`XBsuQXn2m#M)UZb~C09Y2lNV?F5K8W*o9yF%r&}rCCVFYX 
zNj**vR$P4a9rg`lA;G+vf{3XKZDQqi4)yjQZ~DFjR?;{@(caq+c;2PFL|wKu(A3BcNi^eva#H!I|q)=SA$N zstD3qD-u({%uz9!F#(g`NwqECc~8h>67IRV$}=*YEc|x*aQ*TL-wUQ+e@t;x^xl4ZDP!zu*6_6?Evp7{Yu(ggCtBYBlW zZFj7Y{dVBXEhm*g>hEhU7749XU`6<9VqSv-K|XV+)?2vA=~4cny!=+6+uX&$MOwNG zHgZlS8|cy@N1pG&fKvCCbL|c<4^q7c3k1P8qG@vavQ-c?%g*- z&ryn3t!_voq%QxiK5qw*PZZTuwQ=GmE4_m-gL^eJF~Ej0Y1Y0>jngFsB+7B`KvP%c zj4tzO=5l#aASro?(zjZ%D@T@I&>a4{E!bYp()mX6J*#Sjs%7BM!JxS&i@}1XE=#}z zRxH+P{{53$(m)41g0H$9-WL|IsP(9I?J|Dc>M&sP=-ZL*(ydpA$J;8Amsz@)X>rVMkX<9XR~L9@ByIfWb|9Q!pz>((ukgG91PQcn{sN*Ex8?y5Jv|QT z^>3c;v$V$&(7Kf|zKe1CzuNoqs3g~~Z8SZmX`W-2S*BqrnU$qwngbPClBOps$+{y1 z$}MTb5(7du+c5SvdyEAz41_ma(^B0%x^ZEv-AaQibYC4~Qp3WQ!d4{XAchDj+n9#^ zPS*C=%WIEbFmLBiR%dalkE}X+({&A- zn|p$0%%MTBX9i0x8}&1E4LtgXIa1EX%>5vogSXq+Jm_YFxYmuy%We^CdF;p9>#7NG zSgoS`$%tFP50RDKi$w7Az7K>G^BVspyt3g1K;%kqY-s(f4M_-{3GKX z+8lPdBc`1XTiY(Sy_~WACC9vnLzmOV^U(||_CToOj5>TV_A^5<_Ks^rr=^&F!ZAY9=@%GpwkcpA0>m3K zoSqK7<1Xy(x+*o83lVWXy?>@z!>Mm5xxyGiGMP#TK9aARcOHGGmIiGo-cBt7%!C`XH-RRD*xUM9q->YJi6@#e!YvEl63Nk$hV1Z;0|(2k!rYk+ z#fRyo7fZ!3E4VS0E0FSo!n1kKpfuxPARk$A1!6e6cm z`Lyh#6Q*v~qf(JZtCxRTZQZ}t+EI6vP z?0j@eP@wF1M6tU~T&imV8=aXYUfBVkktMV~d=uEdIJRZjgbzQr^9-iE}&X_Ll{Aq4z*@oUN?~q>sW$(W+7MuB4kTX%*g+1W6 zbixh0U3k4O#M~hwdXnmw3>D%CSG_-l?04V?!`*d-QRc9R>{*^KIGFIFhSj6D zeX!46&ipr5-`{gfu6?@N*P%C@7j0#5n@*+A-g~rc_>!tBMIrSETnj2bMoi!5%WZa= z+^5@&P^t;n+wbv{i-+v6aVUSm^hXUKc}6im@24$7+j>}MUJWT_NmZI(AO#&-zaf<3 z0&mSZ7nIE6mFl4rqVb=px8BwjH2bA6oy}Mc`qycbQKHIh+l(h0w}oCfg#Uiwd$mjJ zU?_+F)au{@0~3=Pz8JN)y$`q7*Y>adTMm-WFtj!)dT&zv1Ge#=fo{L4xpn9&525gq zie}8Ji8iubxLu_@0j~Tl+B4sU_!D`j(}@vQr+0Av0~h+?eA1p}i%H@2L(ad;fE_-d zWm>WSvzM3Z1hGP1#U#?e=)M3+(NTrXpZ?9!ZzN3-=>vqb@1j-E6@#<)R!}+h$L|1&F zBXLP7Uky9-tgJ)rYo-iOB|Gvy=&zV>V`j3txKh)@Q>5R!(dO*8zVtFI72Q&nfcH6_ z%rkR69~Uq7UwIADZ@ys-H}_;NYnKFpm{nl)N9@|8)?*e40e|Ei8Q<+Z$wOTm#P6A* zd}P^Lcppkum&JrH_x`SrcVfa|d79O5zbj;y71kI|B9eqnPbBfTY<9K$u7Vov6VG7! 
zm^@#TR#vd0&Zf~*g%z*zgkipSK2qs5uB$c2_kKuJ+clYT+xnbHtpxLk7NxJ@(svM5 zRh?q*wi4DSa*8VIDoV8EYAN7JFX$H}Z?m=Xa~(;<+m%jnxxJY?vUaoabxy{DRx-ac zrt8%avQ@Sl{8WPWZ(yIkl=$2&+~Po|B2NErH(uR(pou$p?gtR{I9owFdhx#9_b#Qa zebrE|EXfuqp3#d!!en>ecnube(;oQkzPfUB)1#Zy*@a6UX?^80mwU5Rx7GXG;$;nG z{&Z`GTB$5lkFL>T2*ks+4>Mc$5p1cMe{E; zD;gLFZT6!)KoZbZ_0z4s^svPW&U$#+++q*w`Lg;J(F~5RD2@oYUvM|oBBTp>v;6h7 z+9l!*BQtZ2}(i*jy{3BpG*!v^*L2rM~6vMo6L&QA57+!MBd=WzNxKujLM>|L>ydF6b~Sx5K7_Fq^4w9z@@@oI;CCSh1F z$eFFxmPmNgk><-@*Bu_=CmYiXYPF-_xnnzL>PO4Z*J9UyI{q{FTd9c96X2fB_G7d| zF>jo|nYqu4hV;J|)WF@+l4`2pXk_rqaElMH6;mYS2xow$>@%xXq4hr=0mEU=TPIDx zyxWDgTc?uv`)$it*@RRDNA;-+OkP8rG-t8sks;9b>Bvt$p`ZPTg4HtBm@4Ix>NSvKYbxc;hj4(6-R(DeIqdySU~plX@eVq{gl%M*pM^7X zu6oX{?S-uY29$=tsR|I;l>{X~f&0BMrAY z0mrlk@S8~NSphx0qyXVfn=)oy-1~XGps@JBqc;I=>Qq3j=`}$WANGK zhqz?4b#dXH8Qo(D>Ufwvi8AzB^oa8X$_@vts+h9qu?_C_Tgo3vSb84=k0IYvc`seM z`D}Rba)`ccpqvzB?`rmNhz)y>-|%Xc^rG9*?btUF2PYv)g^X@ZChV2x`F_1MZ~--H z#W^h#aIAbkbNBVR<_vrxupC?p! zn0H6ESK*?Y@&x3p7B2eDhKMKE-z(B90Om`7o%2(-b86uI7Eqzd)mD?g zFQs;t{THn`g z>MdDh*V#Hg{ABS$rOEcqBv$P=v<|bu+ZNGym>aQdonvZrS~fmPU+v?Rz;fTXW;P(i zr}7)JJ+X?&6;_*|wa3RtV*odH1^B^hm~W7xFi!(mmie7d#Rwf;(xuso7isN^LuHh) zmFC6G#yl9-<>z1UrYE)bZVpZ+LnE)L&ntsrW#*bei9WLNjd$AvnBEN=U9p{HLpM#O z^bKEJP}B`Elio*!`(Zu!q-&sTuHxl!-!PtJMtiX8cVFC&W9+tBBzvv}<2NV!_0{a3 zS_er;N4-wJ%8t{QVB{y$MY-BXtNG}ZDfL!^yPyypel@0LG}tQUx3E%vv!C}&hYi8- zj{WhM(sKiW)!|rvmxUe?nPl6?rPlT#t%camP>mc6u_*cIy8Z=i?r3F}^HWedsx9S^ zz>d1_>-mW7Q-VM@w=>>^!7dVB^e7K6mf+v*5E|W?C9^tMdpPjRKGvN6M#O?kr=ZmZ z{bX?)S`MIzzy7Q@2Bq)0WM+GGd6DDR4+R?*3jUdz?Flu4zE*2yOK0Nj>w{TJqa(AZXzQXrbF~K-+HRzl1tJ(buL{DNZ@8kqyWgRQoTzonoMIurD4i_1!bf{rphl%ddXrCm zVoe5N`FqWT(H#$=*#1wm;7I0X+nyYWWKx1CG?dY^?-bjq);HejIan8LeJl5aUV}?Z z*}lhY8tWFAW`XpxxnLYTK>iDrocraR83kXhK4lOTdb~~zC2>a79!8l74^5tXM|rA_ zAna&HFpSt)t)zycGupX>r`?K6uFn(0yO9Er3+qD1Qfk~YgGaCTmq1|?b>U% z+W2UKN1=+?O>@g8BNPyuLR084DK>pMjit`&7$RV9pPaiqm@2@cc4~@7Sdq1h`faZW zEcY_Skm_$|8oC>aF)vmp$pBiBW&s?O>kvQ_6oTpeu4p^(L7$ 
zYWTde$56XX-8-N|mO^xfh&l#>yiNVe9On37roZH2;30+y^yE#)CsFeyRJ%HULq^qq-w(V?h|bH?tvP=r2Y5xd4z>DU@b;C`1^I}GLa_&l?^ zl(R=+MF_qJOYaZ4Yl*3*B*izgnRbHFmE3)5n-z}G#pBctkI zFtCfqi6p1oWFg<9L57q37^I398{jLwX%4fv@!{H^KyYm7DD`1ASvYp1{|qIHSEwVJ zTBB#O`sv7lWP4ENZuv9@+cYVSO)=v{I;@?O^piEPYR@Rz?lZ`ECY#^AbTK>+fK`Z zH3N+CH{+stgl1?Z1>@yxYX$@e8~bDE;oaON`_f#(?~xrbZMBo;f)+)D4CLZ_Y_&cl zOBcUGN@sAB$_o@^J~QEJkf{xG9-NiPBt7@*P4aK(U<(dzkctnqPG1%J=mF*hb? z6o{~&0<4$5#1kXGCTgU1utAe}K9L?hAEsLVkv~tBh$TY`s?}rTzpy0}M|<>ie5vKI zb^bFr_ed1egox#V!f4F*;c+p)uH%)+Zpa8&@c4I^$fL0t9hgV0RZ;%JF!?mr5Yw)N$FLmT_skzc)rONtm4C_XzD+3kQeHw!k^Z0M8b6%2_ z#JUF1Vo$E_J5B}QA3_=hW^0XjC`HpV<sDy>8a)hkEh4oWbp~gGcqCV2E~?P z;C{teaj2Z8^!>VmIi36-C@1WO*K?GOP=G9nfj}z7)4h<+y{Ka0H*-{_ux?$RdmY_4 zU8*f%{bZb*BfaWMbt>ChTCN#n3<%U045IpNaD4#+go@MjKEc8xtQVF(lms{s?w^zv zc$Q{hKFo`-xTJi5hBF~;AI=jDGxo_xPVtTEy{ZDyvT+Ftc+bNhtQW=;Y&B2zs{W*1 zPq7@ALS@HV;TRB+JY17$Sw)owxwv9v43sLzg3>ywo>e;wK`!cn1VRj653|*KCQ^oM z|5ydx@hCBPS&cUA>2GLM-Qskf1#6Cq0Wm2wssqB0<=TR%y1X&Lu3zBPGhJI25ucKvHcSboJ`=gY?VpAUSwlursZaAEBMc!?POCpI>GjW+trWeZ zn?-^^I$(g*YJst&3W})Ui|(aK300vBQ;SkfTc`uP*27Y;VTYk5%yatV1T*?=91E2TpDrhI3;4KIjy&v(L^% zAD9Wo!lPq(fJCYNwOwH8oqF>A`l_95Q%@Z2YLUnaHD0|N;g)m&v9?9Uj&>rR z9nL(JL!SK*>W|mHwmt=jU3-3{-`f8-gm}nuJ4$TQbT#SVo_bNoE3cE3tDjOf&*WxY z-du-ivx9&m)4LJQ$dw$KKPMGrz62;?(wNw!H`G>zzCCODvO{x%SmB$Eg_`@K!|AMuAU<@PlA(!^!&ZdLp zHaX$@L-^gbP3a;(%uk!omQz+V+=MA~Y0&%?JA0531|vX>V{oSyEptIyd{kK$mX`AjCeL+VE00xg zGx!Ey{M*rLR6M4-^);AB)^ROkcb|BOS$wc6&a8-DyXW)m%six+>Env z1)Un{anV2~=Wy-&`i%_C5ZIwBV;dn}Dh9m%YAEI03-k+GY2B72+HSSOpo5mJF)B{b zky1PEW8Tw{!YRPmiG((_{OW{e{d42Q8pw8E2s0~adFBYQoL9G->Wf<9*)K`^j7QgL zI=%IjX8LohOukdg3}^!krPcfe<4MY8wCj(*XD6)V+OMwtb?b`MpMvP_k?y+7(6p_4|KfmQK_l`! 
ze_1+S*@*pQDTiSFMq9W$BlFXUXx~(D#iGQ*yE`tN2u3Ka-!d@fN0dpr-glMJw3xQa zdsqtmnA6;1dL>Nw+$$*5KqwD^Fnx4w)+t>E8`MAh1!5-h{EKl4NO2R z!$oC91mZf{l}j2Du0&|?xM$pNe@*nNh!O|l)+I8B#4%L+{DSInBa!GDJ9}#Xk!*mk zK#*@60;BSTaErtRsrF`vZ%71?q;=09P-0Mhx)XX?GZl}EfA8KXKvv|k{OyjJRyz}I zIkCLQG`4#i{m+Mdigf1Pt)w?xI&r69Tmap0DF`rdP(p#6E=W3EPckbqw_na+!@u3# z;fbMCI~FZEU-YJ)pNTiL+6|zO1JgHFMHe+z{p?r@bCrP90O_!a{=67bFM&l(NA4nR ze+%SJIGWLfPENY+A*fA^Y4F(UQg$QndBx=EsN-?&&%0l$TZDgqUFxw+534B|L_{3h z`Nfy3MwT_hZ@IQUe#$pLUe$Vua^ErtxS6`GKn*S!N9oES7`LYRu!5y;8r0hX0j$op zvkhLmy_Zfo_tMy<5xx5Lou(+kdKqbo-&0t6B>lY^YRWBpIbnV7^&b-9mCwOc3Q+FE z_P(xDaN5NTmQCVCjPV3hX|jYR{a^)ip`WHbb_4pJR&Md2ZOjU&yw<|+v;uNq6P~kVI=V?lMdwpqh z&r$Y4=$zEQeqBNLaNbWnLz z>>|{ohxH+0qG`Z)T6HA^XNaoKqdaa7jDpQkNA)#OMcT(no+m3B0Z1p|V?n*BMmF%M zRFL+dOZXlz~zZD8dg8CS*j zi?^YGhhT^}^D}*5S602(OK;lA)TV@OSe*I%ZWIRsxwHt7!PfKjFInin>G*H^8#G>y zd%;ONP=dn@@eC^eqPH+RfO;OuNs-gBz6XIs>XbM@0vuX?P(oJ0auPPx>Odf$6hJMm zh-4SdO!uXtt?x_`RGau~AV;?$V6hjBN~LqKZ+_LM#lO06Qg)pQMzQSO{>NFD9HjTkrtQmQI^E8-6WOb&tUy8M6d)bqbPEB*T|a2x&) z&zt|Q;lFG6zbI4u8#e!j&A(yuZ|?n1%bWk^jeqmT|HHgtjD6k&ZU6&$;@J8B4 + + + + Spring Data MongoDB + Reference Guide + Mark Pollack, Thomas Risberg, + Oliver Gierke, Costin Leau, + Jon Brisbin, Thomas Darimont, + Christoph Strobl, Mark Paluch, + Jay Bryant + diff --git a/src/main/asciidoc/index.adoc b/src/main/asciidoc/index.adoc index 4166c6eff..551ed0bc5 100644 --- a/src/main/asciidoc/index.adoc +++ b/src/main/asciidoc/index.adoc @@ -1,49 +1,48 @@ = Spring Data MongoDB - Reference Documentation -Mark Pollack; Thomas Risberg; Oliver Gierke; Costin Leau; Jon Brisbin; Thomas Darimont; Christoph Strobl; Mark Paluch +Mark Pollack; Thomas Risberg; Oliver Gierke; Costin Leau; Jon Brisbin; Thomas Darimont; Christoph Strobl; Mark Paluch; Jay Bryant :revnumber: {version} :revdate: {localdate} -:toc: -:toc-placement!: +:linkcss: +:doctype: book +:docinfo: shared +:toc: left +:toclevels: 4 +:source-highlighter: prettify +:icons: font +:imagesdir: images 
+ifdef::backend-epub3[:front-cover-image: image:epub-cover.png[Front Cover,1050,1600]] :spring-data-commons-docs: ../../../../spring-data-commons/src/main/asciidoc (C) 2008-2018 The original authors. -NOTE: _Copies of this document may be made for your own use and for distribution to others, provided that you do not charge any fee for such copies and further provided that each copy contains this Copyright Notice, whether distributed in print or electronically._ - -toc::[] +NOTE: Copies of this document may be made for your own use and for distribution to others, provided that you do not charge any fee for such copies and further provided that each copy contains this Copyright Notice, whether distributed in print or electronically. include::preface.adoc[] -:leveloffset: +1 -include::new-features.adoc[] -include::{spring-data-commons-docs}/dependencies.adoc[] -include::{spring-data-commons-docs}/repositories.adoc[] -:leveloffset: -1 +include::new-features.adoc[leveloffset=+1] +include::{spring-data-commons-docs}/dependencies.adoc[leveloffset=+1] +include::{spring-data-commons-docs}/repositories.adoc[leveloffset=+1] [[reference]] = Reference Documentation -:leveloffset: +1 -include::reference/introduction.adoc[] -include::reference/mongodb.adoc[] -include::reference/reactive-mongodb.adoc[] -include::reference/mongo-repositories.adoc[] -include::reference/reactive-mongo-repositories.adoc[] -include::{spring-data-commons-docs}/auditing.adoc[] -include::reference/mongo-auditing.adoc[] -include::reference/mapping.adoc[] -include::reference/cross-store.adoc[] -include::reference/jmx.adoc[] -include::reference/mongo-3.adoc[] -:leveloffset: -1 +include::reference/introduction.adoc[leveloffset=+1] +include::reference/mongodb.adoc[leveloffset=+1] +include::reference/reactive-mongodb.adoc[leveloffset=+1] +include::reference/mongo-repositories.adoc[leveloffset=+1] +include::reference/reactive-mongo-repositories.adoc[leveloffset=+1] 
+include::{spring-data-commons-docs}/auditing.adoc[leveloffset=+1] +include::reference/mongo-auditing.adoc[leveloffset=+1] +include::reference/mapping.adoc[leveloffset=+1] +include::reference/cross-store.adoc[leveloffset=+1] +include::reference/jmx.adoc[leveloffset=+1] +include::reference/mongo-3.adoc[leveloffset=+1] [[appendix]] = Appendix :numbered!: -:leveloffset: +1 -include::{spring-data-commons-docs}/repository-namespace-reference.adoc[] -include::{spring-data-commons-docs}/repository-populator-namespace-reference.adoc[] -include::{spring-data-commons-docs}/repository-query-keywords-reference.adoc[] -include::{spring-data-commons-docs}/repository-query-return-types-reference.adoc[] -:leveloffset: -1 +include::{spring-data-commons-docs}/repository-namespace-reference.adoc[leveloffset=+1] +include::{spring-data-commons-docs}/repository-populator-namespace-reference.adoc[leveloffset=+1] +include::{spring-data-commons-docs}/repository-query-keywords-reference.adoc[leveloffset=+1] +include::{spring-data-commons-docs}/repository-query-return-types-reference.adoc[leveloffset=+1] diff --git a/src/main/asciidoc/new-features.adoc b/src/main/asciidoc/new-features.adoc index 2f5e65a90..0f50154f1 100644 --- a/src/main/asciidoc/new-features.adoc +++ b/src/main/asciidoc/new-features.adoc @@ -2,61 +2,60 @@ = New & Noteworthy [[new-features.2-0-0]] -== What's new in Spring Data MongoDB 2.0 +== What's New in Spring Data MongoDB 2.0 * Upgrade to Java 8. -* Usage of the `Document` API instead of `DBObject`. +* Usage of the `Document` API, instead of `DBObject`. * <>. * <> queries. -* Support for aggregation result streaming via Java 8 `Stream`. +* Support for aggregation result streaming by using Java 8 `Stream`. * <> for CRUD and aggregation operations. -* Kotlin extensions for Template and Collection API. +* Kotlin extensions for Template and Collection APIs. * Integration of collations for collection and index creation and query operations. 
* Query-by-Example support without type matching. -* Add support for isolation ``Update``s. -* Tooling support for null-safety via Spring's `@NonNullApi` and `@Nullable` annotations. +* Support for isolation `Update` operations. +* Tooling support for null-safety by using Spring's `@NonNullApi` and `@Nullable` annotations. * Deprecated cross-store support and removed Log4j appender. [[new-features.1-10-0]] -== What's new in Spring Data MongoDB 1.10 +== What's New in Spring Data MongoDB 1.10 * Compatible with MongoDB Server 3.4 and the MongoDB Java Driver 3.4. -* New annotations for `@CountQuery`, `@DeleteQuery` and `@ExistsQuery`. +* New annotations for `@CountQuery`, `@DeleteQuery`, and `@ExistsQuery`. * Extended support for MongoDB 3.2 and MongoDB 3.4 aggregation operators (see <>). -* Support partial filter expression when creating indexes. -* Publish lifecycle events when loading/converting ``DBRef``s. +* Support for partial filter expression when creating indexes. +* Publishing lifecycle events when loading or converting `DBRef` instances. * Added any-match mode for Query By Example. * Support for `$caseSensitive` and `$diacriticSensitive` text search. * Support for GeoJSON Polygon with hole. -* Performance improvements by bulk fetching ``DBRef``s. -* Multi-faceted aggregations using `$facet`, `$bucket` and `$bucketAuto` via `Aggregation`. +* Performance improvements by bulk-fetching `DBRef` instances. +* Multi-faceted aggregations using `$facet`, `$bucket`, and `$bucketAuto` with `Aggregation`. [[new-features.1-9-0]] -== What's new in Spring Data MongoDB 1.9 -* The following annotations have been enabled to build own, composed annotations: `@Document`, `@Id`, `@Field`, `@Indexed`, `@CompoundIndex`, `@GeoSpatialIndexed`, `@TextIndexed`, `@Query`, `@Meta`. 
+== What's New in Spring Data MongoDB 1.9 +* The following annotations have been enabled to build your own composed annotations: `@Document`, `@Id`, `@Field`, `@Indexed`, `@CompoundIndex`, `@GeoSpatialIndexed`, `@TextIndexed`, `@Query`, and `@Meta`. * Support for <> in repository query methods. * Support for <>. * Out-of-the-box support for `java.util.Currency` in object mapping. -* Add support for the bulk operations introduced in MongoDB 2.6. +* Support for the bulk operations introduced in MongoDB 2.6. * Upgrade to Querydsl 4. * Assert compatibility with MongoDB 3.0 and MongoDB Java Driver 3.2 (see: <>). [[new-features.1-8-0]] -== What's new in Spring Data MongoDB 1.8 +== What's New in Spring Data MongoDB 1.8 * `Criteria` offers support for creating `$geoIntersects`. -* Support http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#expressions[SpEL expressions] in `@Query`. -* `MongoMappingEvents` expose the collection name they are issued for. +* Support for http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#expressions[SpEL expressions] in `@Query`. +* `MongoMappingEvents` expose the collection name for which they are issued. * Improved support for ``. * Improved index creation failure error message. [[new-features.1-7-0]] -== What's new in Spring Data MongoDB 1.7 +== What's New in Spring Data MongoDB 1.7 * Assert compatibility with MongoDB 3.0 and MongoDB Java Driver 3-beta3 (see: <>). * Support JSR-310 and ThreeTen back-port date/time types. -* Allow `Stream` as query method return type (see: <>). -* Added http://geojson.org/[GeoJSON] support in both domain types and queries (see: <>). +* Allow `Stream` as a query method return type (see: <>). +* http://geojson.org/[GeoJSON] support in both domain types and queries (see: <>). * `QueryDslPredicateExcecutor` now supports `findAll(OrderSpecifier… orders)`. -* Support calling JavaScript functions via <>. 
-* Improve support for `CONTAINS` keyword on collection like properties.
-* Support for `$bit`, `$mul` and `$position` operators to `Update`.
-
+* Support calling JavaScript functions with <>.
+* Improve support for `CONTAINS` keyword on collection-like properties.
+* Support for the `$bit`, `$mul`, and `$position` operators for `Update`.
diff --git a/src/main/asciidoc/preface.adoc b/src/main/asciidoc/preface.adoc
index fd7c8ef11..9c24dd6b7 100644
--- a/src/main/asciidoc/preface.adoc
+++ b/src/main/asciidoc/preface.adoc
@@ -1,59 +1,60 @@
[[preface]]
= Preface

-The Spring Data MongoDB project applies core Spring concepts to the development of solutions using the MongoDB document style data store. We provide a "template" as a high-level abstraction for storing and querying documents. You will notice similarities to the JDBC support in the Spring Framework.
+The Spring Data MongoDB project applies core Spring concepts to the development of solutions that use the MongoDB document style data store. We provide a "`template`" as a high-level abstraction for storing and querying documents. You may notice similarities to the JDBC support provided by the Spring Framework.

-This document is the reference guide for Spring Data - Document Support. It explains Document module concepts and semantics and the syntax for various store namespaces.
+This document is the reference guide for Spring Data - Document Support. It explains Document module concepts and semantics and the syntax for various store namespaces.

This section provides some basic introduction to Spring and Document databases. The rest of the document refers only to Spring Data MongoDB features and assumes the user is familiar with MongoDB and Spring concepts.
[[get-started:first-steps:spring]] -== Knowing Spring -Spring Data uses Spring framework's http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html[core] functionality, such as the http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#beans[IoC] container, http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#validation[type conversion system], http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#expressions[expression language], http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/integration.html#jmx[JMX integration], and portable http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/data-access.html#dao-exceptions[DAO exception hierarchy]. While it is not important to know the Spring APIs, understanding the concepts behind them is. At a minimum, the idea behind IoC should be familiar for whatever IoC container you choose to use. +== Learning Spring -The core functionality of the MongoDB support can be used directly, with no need to invoke the IoC services of the Spring Container. This is much like `JdbcTemplate` which can be used 'standalone' without any other services of the Spring container. To leverage all the features of Spring Data MongoDB, such as the repository support, you will need to configure some parts of the library using Spring. +Spring Data uses Spring framework's http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html[core] functionality, including: -To learn more about Spring, you can refer to the comprehensive (and sometimes disarming) documentation that explains in detail the Spring Framework. There are a lot of articles, blog entries and books on the matter - take a look at the Spring framework http://spring.io/docs[home page] for more information. 
+* http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#beans[IoC] container
+* http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#validation[type conversion system]
+* http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#expressions[expression language]
+* http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/integration.html#jmx[JMX integration]
+* http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/data-access.html#dao-exceptions[DAO exception hierarchy]
+
+While you need not know the Spring APIs, understanding the concepts behind them is important. At a minimum, the idea behind Inversion of Control (IoC) should be familiar, and you should be familiar with whatever IoC container you choose to use.
+
+The core functionality of the MongoDB support can be used directly, with no need to invoke the IoC services of the Spring Container. This is much like `JdbcTemplate`, which can be used "`standalone`" without any other services of the Spring container. To leverage all the features of Spring Data MongoDB, such as the repository support, you need to configure some parts of the library to use Spring.
+
+To learn more about Spring, you can refer to the comprehensive documentation that explains the Spring Framework in detail. There are a lot of articles, blog entries, and books on the subject. See the Spring framework http://spring.io/docs[home page] for more information.

[[get-started:first-steps:nosql]]
-== Knowing NoSQL and Document databases
-NoSQL stores have taken the storage world by storm. It is a vast domain with a plethora of solutions, terms and patterns (to make things worse even the term itself has multiple http://www.google.com/search?q=nosoql+acronym[meanings]). While some of the principles are common, it is crucial that the user is familiar to some degree with MongoDB.
The best way to get acquainted to this solutions is to read their documentation and follow their examples - it usually doesn't take more then 5-10 minutes to go through them and if you are coming from an RDMBS-only background many times these exercises can be an eye opener.
+== Learning NoSQL and Document databases
+NoSQL stores have taken the storage world by storm. It is a vast domain with a plethora of solutions, terms, and patterns (to make things worse, even the term itself has multiple http://www.google.com/search?q=nosoql+acronym[meanings]). While some of the principles are common, you must be familiar with MongoDB to some degree. The best way to get acquainted is to read the documentation and follow the examples. It usually does not take more than 5-10 minutes to go through them and, especially if you are coming from an RDBMS-only background, these exercises can be an eye opener.

-The jumping off ground for learning about MongoDB is http://www.mongodb.org/[www.mongodb.org]. Here is a list of other useful resources:
+The starting point for learning about MongoDB is http://www.mongodb.org/[www.mongodb.org]. Here is a list of other useful resources:

-* The http://docs.mongodb.org/manual/[manual] introduces MongoDB and contains links to getting started guides, reference documentation and tutorials.
+* The http://docs.mongodb.org/manual/[manual] introduces MongoDB and contains links to getting started guides, reference documentation, and tutorials.
* The http://try.mongodb.org/[online shell] provides a convenient way to interact with a MongoDB instance in combination with the online http://docs.mongodb.org/manual/tutorial/getting-started/[tutorial.]
-* MongoDB http://docs.mongodb.org/ecosystem/drivers/java/[Java Language Center]
-* Several http://www.mongodb.org/books[books] available for purchase
-* Karl Seguin's online book: http://openmymind.net/mongodb.pdf[The Little MongoDB Book]
+* MongoDB http://docs.mongodb.org/ecosystem/drivers/java/[Java Language Center].
+* Several http://www.mongodb.org/books[books] you can purchase.
+* Karl Seguin's online book: http://openmymind.net/mongodb.pdf[The Little MongoDB Book].

[[requirements]]
== Requirements

-Spring Data MongoDB 1.x binaries requires JDK level 6.0 and above, and http://spring.io/docs[Spring Framework] {springVersion} and above.
+The Spring Data MongoDB 1.x binaries require JDK level 6.0 or above and http://spring.io/docs[Spring Framework] {springVersion} or above.

-In terms of document stores, http://www.mongodb.org/[MongoDB] at least 2.6.
-
-== Additional Help Resources
-
-Learning a new framework is not always straight forward. In this section, we try to provide what we think is an easy to follow guide for starting with Spring Data MongoDB module. However, if you encounter issues or you are just looking for an advice, feel free to use one of the links below:
+In terms of document stores, you need at least version 2.6 of http://www.mongodb.org/[MongoDB].

[[get-started:help]]
-=== Support
+== Additional Help Resources

-There are a few support options available:
+Learning a new framework is not always straightforward. In this section, we try to provide what we think is an easy-to-follow guide for starting with the Spring Data MongoDB module. However, if you encounter issues or you need advice, feel free to use one of the following links:

[[get-started:help:community]]
-==== Community Forum
-
-Spring Data on Stackoverflow http://stackoverflow.com/questions/tagged/spring-data[Stackoverflow] is a tag for all Spring Data (not just Document) users to share information and help each other. Note that registration is needed *only* for posting.
+Community Forum :: Spring Data on http://stackoverflow.com/questions/tagged/spring-data[Stack Overflow] is a tag for all Spring Data (not just Document) users to share information and help each other. Note that registration is needed only for posting. [[get-started:help:professional]] -==== Professional Support - -Professional, from-the-source support, with guaranteed response time, is available from http://pivotal.io/[Pivotal Sofware, Inc.], the company behind Spring Data and Spring. +Professional Support :: Professional, from-the-source support, with guaranteed response time, is available from http://pivotal.io/[Pivotal Software, Inc.], the company behind Spring Data and Spring. [[get-started:up-to-date]] -=== Following Development +== Following Development -For information on the Spring Data Mongo source code repository, nightly builds and snapshot artifacts please see the http://projects.spring.io/spring-data-mongodb/[Spring Data Mongo homepage]. You can help make Spring Data best serve the needs of the Spring community by interacting with developers through the Community on http://stackoverflow.com/questions/tagged/spring-data[Stackoverflow]. To follow developer activity look for the mailing list information on the Spring Data Mongo homepage. If you encounter a bug or want to suggest an improvement, please create a ticket on the Spring Data issue https://jira.spring.io/browse/DATAMONGO[tracker]. To stay up to date with the latest news and announcements in the Spring eco system, subscribe to the Spring Community http://spring.io[Portal]. Lastly, you can follow the Spring http://spring.io/blog[blog ]or the project team on Twitter (http://twitter.com/SpringData[SpringData]). +For information on the Spring Data Mongo source code repository, nightly builds, and snapshot artifacts, see the Spring Data Mongo http://projects.spring.io/spring-data-mongodb/[homepage].
You can help make Spring Data best serve the needs of the Spring community by interacting with developers through the Community on http://stackoverflow.com/questions/tagged/spring-data[Stack Overflow]. To follow developer activity, look for the mailing list information on the Spring Data Mongo https://projects.spring.io/spring-data-mongodb/[homepage]. If you encounter a bug or want to suggest an improvement, please create a ticket on the Spring Data issue https://jira.spring.io/browse/DATAMONGO[tracker]. To stay up to date with the latest news and announcements in the Spring ecosystem, subscribe to the Spring Community http://spring.io[Portal]. You can also follow the Spring http://spring.io/blog[blog] or the project team on Twitter (http://twitter.com/SpringData[SpringData]). diff --git a/src/main/asciidoc/reference/cross-store.adoc b/src/main/asciidoc/reference/cross-store.adoc index 763837d62..e266c0916 100644 --- a/src/main/asciidoc/reference/cross-store.adoc +++ b/src/main/asciidoc/reference/cross-store.adoc @@ -1,18 +1,18 @@ [[mongo.cross.store]] -= Cross Store support += Cross Store Support -WARNING: Deprecated - will be removed without replacement. +WARNING: This feature has been deprecated and will be removed without replacement. -Sometimes you need to store data in multiple data stores and these data stores can be of different types. One might be relational while the other a document store. For this use case we have created a separate module in the MongoDB support that handles what we call cross-store support. The current implementation is based on JPA as the driver for the relational database and we allow select fields in the Entities to be stored in a Mongo database. In addition to allowing you to store your data in two stores we also coordinate persistence operations for the non-transactional MongoDB store with the transaction life-cycle for the relational database.
+Sometimes you need to store data in multiple data stores, and these data stores can be of different types. One might be relational while the other is a document store. For this use case, we created a separate module in the MongoDB support that handles what we call "`cross-store support`". The current implementation is based on JPA as the driver for the relational database, and we let selected fields in the entities be stored in a Mongo database. In addition to letting you store your data in two stores, we also coordinate persistence operations for the non-transactional MongoDB store with the transaction life-cycle for the relational database. [[mongodb_cross-store-configuration]] == Cross Store Configuration -Assuming that you have a working JPA application and would like to add some cross-store persistence for MongoDB. What do you have to add to your configuration? +Assuming that you have a working JPA application and would like to add some cross-store persistence for MongoDB, what do you have to add to your configuration? -First of all you need to add a dependency on the module. Using Maven this is done by adding a dependency to your pom: +First, you need to add a dependency on the cross-store module. If you use Maven, you can add the following dependency to your pom: -.Example Maven pom.xml with spring-data-mongodb-cross-store dependency +.Example Maven pom.xml with `spring-data-mongodb-cross-store` dependency ==== [source,xml] ---- @@ -35,7 +35,7 @@ First of all you need to add a dependency on the module. Using Maven this is do ---- ==== -Once this is done we need to enable AspectJ for the project. The cross-store support is implemented using AspectJ aspects so by enabling compile time AspectJ support the cross-store features will become available to your project. In Maven you would add an additional plugin to the section of the pom:
The cross-store support is implemented with AspectJ aspects so, if you enable compile-time AspectJ support, the cross-store features become available to your project. In Maven, you would add an additional plugin to the `` section of the pom, as follows: .Example Maven pom.xml with AspectJ plugin enabled ==== @@ -105,7 +105,7 @@ Once this is done we need to enable AspectJ for the project. The cross-store sup ---- ==== -Finally, you need to configure your project to use MongoDB and also configure the aspects that are used. The following XML snippet should be added to your application context: +Finally, you need to configure your project to use MongoDB and also configure which aspects are used. You should add the following XML snippet to your application context: .Example application context with MongoDB and cross-store aspect support ==== @@ -159,7 +159,13 @@ Finally, you need to configure your project to use MongoDB and also configure th [[mongodb_cross-store-application]] == Writing the Cross Store Application -We are assuming that you have a working JPA application so we will only cover the additional steps needed to persist part of your Entity in your Mongo database. First you need to identify the field you want persisted. It should be a domain class and follow the general rules for the Mongo mapping support covered in previous chapters. The field you want persisted in MongoDB should be annotated using the `@RelatedDocument` annotation. That is really all you need to do!. The cross-store aspects take care of the rest. This includes marking the field with `@Transient` so it won't be persisted using JPA, keeping track of any changes made to the field value and writing them to the database on successful transaction completion, loading the document from MongoDB the first time the value is used in your application. Here is an example of a simple Entity that has a field annotated with `@RelatedDocument`. 
+We assume that you have a working JPA application, so we cover only the additional steps needed to persist part of your entity in your Mongo database. To do so, you need to identify the field you want to persist. It should be a domain class and follow the general rules for the Mongo mapping support covered in previous chapters. The field you want to persist in MongoDB should be annotated with the `@RelatedDocument` annotation. That is really all you need to do. The cross-store aspects take care of the rest, including: + +* Marking the field with `@Transient` so that it is not persisted by JPA +* Keeping track of any changes made to the field value and writing them to the database on successful transaction completion +* Loading the document from MongoDB the first time the value is used in your application + +The following example shows an entity that has a field annotated with `@RelatedDocument`: .Example of Entity with @RelatedDocument ==== [source,java] ---- @@ -184,7 +190,9 @@ public class Customer { ---- ==== -.Example of domain class to be stored as document +The following example shows a domain class that is to be stored as a `Document`: +.Example of a domain class to be stored as a Document ==== [source,java] ---- @@ -216,7 +224,7 @@ public class SurveyInfo { ---- ==== -Once the SurveyInfo has been set on the Customer object above the MongoTemplate that was configured above is used to save the SurveyInfo along with some metadata about the JPA Entity is stored in a MongoDB collection named after the fully qualified name of the JPA Entity class. The following code:
The following code shows how to configure a JPA entity for cross-store persistence with MongoDB: .Example of code using the JPA Entity configured for cross-store persistence ==== @@ -234,7 +242,7 @@ customerRepository.save(customer); ---- ==== -Executing the code above results in the following JSON document stored in MongoDB. +Running the preceding above results in the following JSON document being stored in MongoDB: .Example of JSON document stored in MongoDB ==== diff --git a/src/main/asciidoc/reference/introduction.adoc b/src/main/asciidoc/reference/introduction.adoc index ff14c7a69..d7a90227c 100644 --- a/src/main/asciidoc/reference/introduction.adoc +++ b/src/main/asciidoc/reference/introduction.adoc @@ -5,7 +5,6 @@ This part of the reference documentation explains the core functionality offered by Spring Data MongoDB. -<> introduces the MongoDB module feature set. - -<> introduces the repository support for MongoDB. +"`<>`" introduces the MongoDB module feature set. +"`<>`" introduces the repository support for MongoDB. diff --git a/src/main/asciidoc/reference/jmx.adoc b/src/main/asciidoc/reference/jmx.adoc index 474319c6a..a49b5d196 100644 --- a/src/main/asciidoc/reference/jmx.adoc +++ b/src/main/asciidoc/reference/jmx.adoc @@ -1,12 +1,12 @@ [[mongo.jmx]] = JMX support -The JMX support for MongoDB exposes the results of executing the 'serverStatus' command on the admin database for a single MongoDB server instance. It also exposes an administrative MBean, MongoAdmin which will let you perform administrative operations such as drop or create a database. The JMX features build upon the JMX feature set available in the Spring Framework. See http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/integration.html#jmx[here ] for more details. +The JMX support for MongoDB exposes the results of executing the 'serverStatus' command on the admin database for a single MongoDB server instance. 
It also exposes an administrative MBean, `MongoAdmin`, that lets you perform administrative operations, such as dropping or creating a database. The JMX features build upon the JMX feature set available in the Spring Framework. See http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/integration.html#jmx[here] for more details. [[mongodb:jmx-configuration]] == MongoDB JMX Configuration -Spring's Mongo namespace enables you to easily enable JMX functionality +Spring's Mongo namespace lets you enable JMX functionality, as the following example shows: .XML schema to configure MongoDB ==== @@ -47,18 +47,18 @@ Spring's Mongo namespace enables you to easily enable JMX functionality ---- ==== -This will expose several MBeans +The preceding code exposes several MBeans: -* AssertMetrics -* BackgroundFlushingMetrics -* BtreeIndexCounters -* ConnectionMetrics -* GlobalLoclMetrics -* MemoryMetrics -* OperationCounters -* ServerInfo -* MongoAdmin +* `AssertMetrics` +* `BackgroundFlushingMetrics` +* `BtreeIndexCounters` +* `ConnectionMetrics` +* `GlobalLockMetrics` +* `MemoryMetrics` +* `OperationCounters` +* `ServerInfo` +* `MongoAdmin` -This is shown below in a screenshot from JConsole +The following screenshot from JConsole shows the resulting configuration: -image::jconsole.png[] \ No newline at end of file +image::jconsole.png[] diff --git a/src/main/asciidoc/reference/mapping.adoc b/src/main/asciidoc/reference/mapping.adoc index f799f1e97..03b3882a8 100644 --- a/src/main/asciidoc/reference/mapping.adoc +++ b/src/main/asciidoc/reference/mapping.adoc @@ -1,23 +1,23 @@ [[mapping-chapter]] = Mapping -Rich mapping support is provided by the `MappingMongoConverter`. `MappingMongoConverter` has a rich metadata model that provides a full feature set of functionality to map domain objects to MongoDB documents.The mapping metadata model is populated using annotations on your domain objects. 
However, the infrastructure is not limited to using annotations as the only source of metadata information. The `MappingMongoConverter` also allows you to map objects to documents without providing any additional metadata, by following a set of conventions. +Rich mapping support is provided by the `MappingMongoConverter`. `MappingMongoConverter` has a rich metadata model that provides a full feature set to map domain objects to MongoDB documents. The mapping metadata model is populated by using annotations on your domain objects. However, the infrastructure is not limited to using annotations as the only source of metadata information. The `MappingMongoConverter` also lets you map objects to documents without providing any additional metadata, by following a set of conventions. -In this section we will describe the features of the `MappingMongoConverter`. How to use conventions for mapping objects to documents and how to override those conventions with annotation based mapping metadata. +This section describes the features of the `MappingMongoConverter`, including how to use conventions for mapping objects to documents and how to override those conventions with annotation-based mapping metadata. [[mapping-conventions]] -== Convention based Mapping +== Convention-based Mapping `MappingMongoConverter` has a few conventions for mapping objects to documents when no additional mapping metadata is provided. The conventions are: -* The short Java class name is mapped to the collection name in the following manner. The class `com.bigbank.SavingsAccount` maps to `savingsAccount` collection name. -* All nested objects are stored as nested objects in the document and *not* as DBRefs -* The converter will use any Spring Converters registered with it to override the default mapping of object properties to document field/values. -* The fields of an object are used to convert to and from fields in the document. Public JavaBean properties are not used. 
-* You can have a single non-zero argument constructor whose constructor argument names match top level field names of document, that constructor will be used. Otherwise the zero arg constructor will be used. if there is more than one non-zero argument constructor an exception will be thrown. +* The short Java class name is mapped to the collection name in the following manner. The class `com.bigbank.SavingsAccount` maps to the `savingsAccount` collection name. +* All nested objects are stored as nested objects in the document and *not* as DBRefs. +* The converter uses any Spring Converters registered with it to override the default mapping of object properties to document fields and values. +* The fields of an object are used to convert to and from fields in the document. Public `JavaBean` properties are not used. +* If you have a single non-zero-argument constructor whose constructor argument names match top-level field names of the document, that constructor is used. Otherwise, the zero-argument constructor is used. If there is more than one non-zero-argument constructor, an exception is thrown. [[mapping.conventions.id-field]] -=== How the `_id` field is handled in the mapping layer +=== How the `_id` field is handled in the mapping layer MongoDB requires that you have an `_id` field for all documents. If you don't provide one the driver will assign an ObjectId with a generated value. The "_id" field can be of any type, other than arrays, so long as it is unique. The driver naturally supports all primitive types and Dates. When using the `MappingMongoConverter` there are certain rules that govern how properties from the Java class are mapped to this `_id` field.
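The class-name-to-collection-name convention listed above (for example, `com.bigbank.SavingsAccount` mapping to a `savingsAccount` collection) can be illustrated with a small plain-Java sketch. This is only an approximation for illustration, not Spring Data's actual implementation, and edge cases (such as names that start with two capital letters) may behave differently:

```java
import java.beans.Introspector;

// A rough sketch (not Spring Data's actual code) of the default
// class-name-to-collection-name convention described above.
public class CollectionNameSketch {

    static class SavingsAccount {}  // stand-in for com.bigbank.SavingsAccount

    // The simple (package-less) class name with its first letter lower-cased.
    static String collectionNameFor(Class<?> type) {
        return Introspector.decapitalize(type.getSimpleName());
    }

    public static void main(String[] args) {
        System.out.println(collectionNameFor(SavingsAccount.class)); // prints "savingsAccount"
    }
}
```

Any of the annotations described later in this chapter (such as `@Document(collection = "...")`) override this default.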
@@ -58,10 +58,12 @@ The following outlines what type conversion, if any, will be done on the propert When querying and updating `MongoTemplate` will use the converter to handle conversions of the `Query` and `Update` objects that correspond to the above rules for saving documents so field names and types used in your queries will be able to match what is in your domain classes. [[mapping-conversion]] -== Data mapping and type conversion +== Data Mapping and Type Conversion -This section explain how types are mapped to a MongoDB representation and vice versa. Spring Data MongoDB supports all types that can be represented as BSON, MongoDB's internal document format. -In addition to these types, Spring Data MongoDB provides a set of built-in converters to map additional types. You can provide your own converters to adjust type conversion, see <> for further details. +This section explains how types are mapped to and from a MongoDB representation. Spring Data MongoDB supports all types that can be represented as BSON, MongoDB's internal document format. +In addition to these types, Spring Data MongoDB provides a set of built-in converters to map additional types. You can provide your own converters to adjust type conversion. See <> for further details. + +The following provides samples of each available type conversion: [cols="3,1,6", options="header"] .Type @@ -244,9 +246,9 @@ calling `get()` before the actual conversion [[mapping-configuration]] == Mapping Configuration -Unless explicitly configured, an instance of `MappingMongoConverter` is created by default when creating a `MongoTemplate`. You can create your own instance of the `MappingMongoConverter` so as to tell it where to scan the classpath at startup your domain classes in order to extract metadata and construct indexes. Also, by creating your own instance you can register Spring converters to use for mapping specific classes to and from the database. 
+Unless explicitly configured, an instance of `MappingMongoConverter` is created by default when you create a `MongoTemplate`. You can create your own instance of the `MappingMongoConverter`. Doing so lets you dictate where in the classpath your domain classes can be found, so that Spring Data MongoDB can extract metadata and construct indexes. Also, by creating your own instance, you can register Spring converters to map specific classes to and from the database. -You can configure the `MappingMongoConverter` as well as `com.mongodb.MongoClient` and MongoTemplate either using Java or XML based metadata. Here is an example using Spring's Java based configuration +You can configure the `MappingMongoConverter` as well as `com.mongodb.MongoClient` and `MongoTemplate` by using either Java-based or XML-based metadata. The following example uses Spring's Java-based configuration: .@Configuration class to configure MongoDB mapping support ==== @@ -290,15 +292,15 @@ public class GeoSpatialAppConfig extends AbstractMongoConfiguration { ---- ==== -`AbstractMongoConfiguration` requires you to implement methods that define a `com.mongodb.MongoClient` as well as provide a database name. `AbstractMongoConfiguration` also has a method you can override named `getMappingBasePackage(…)` which tells the converter where to scan for classes annotated with the `@Document` annotation. +`AbstractMongoConfiguration` requires you to implement methods that define a `com.mongodb.MongoClient` as well as provide a database name. `AbstractMongoConfiguration` also has a method named `getMappingBasePackage(…)` that you can override to tell the converter where to scan for classes annotated with the `@Document` annotation. -You can add additional converters to the converter by overriding the method afterMappingMongoConverterCreation. Also shown in the above example is a `LoggingEventListener` which logs `MongoMappingEvent` s that are posted onto Spring's `ApplicationContextEvent` infrastructure.
+You can add additional converters to the converter by overriding the `afterMappingMongoConverterCreation` method. Also shown in the preceding example is a `LoggingEventListener`, which logs `MongoMappingEvent` instances that are posted onto Spring's `ApplicationContextEvent` infrastructure. -NOTE: AbstractMongoConfiguration will create a MongoTemplate instance and registered with the container under the name `mongoTemplate`. +NOTE: `AbstractMongoConfiguration` creates a `MongoTemplate` instance and registers it with the container under the name `mongoTemplate`. -You can also override the method `UserCredentials getUserCredentials()` to provide the username and password information to connect to the database. +You can also override the `UserCredentials getUserCredentials()` method to provide the username and password information to connect to the database. -Spring's MongoDB namespace enables you to easily enable mapping functionality in XML +Spring's MongoDB namespace lets you enable mapping functionality in XML, as the following example shows: .XML schema to configure MongoDB mapping support ==== @@ -345,9 +347,9 @@ Spring's MongoDB namespace enables you to easily enable mapping functionality in The `base-package` property tells it where to scan for classes annotated with the `@org.springframework.data.mongodb.core.mapping.Document` annotation. [[mapping-usage]] -== Metadata based Mapping +== Metadata-based Mapping -To take full advantage of the object mapping functionality inside the Spring Data/MongoDB support, you should annotate your mapped objects with the `@Document` annotation. Although it is not necessary for the mapping framework to have this annotation (your POJOs will be mapped correctly, even without any annotations), it allows the classpath scanner to find and pre-process your domain objects to extract the necessary metadata. 
If you don't use this annotation, your application will take a slight performance hit the first time you store a domain object because the mapping framework needs to build up its internal metadata model so it knows about the properties of your domain object and how to persist them. +To take full advantage of the object mapping functionality inside the Spring Data MongoDB support, you should annotate your mapped objects with the `@Document` annotation. Although it is not necessary for the mapping framework to have this annotation (your POJOs are mapped correctly, even without any annotations), it lets the classpath scanner find and pre-process your domain objects to extract the necessary metadata. If you do not use this annotation, your application takes a slight performance hit the first time you store a domain object, because the mapping framework needs to build up its internal metadata model so that it knows about the properties of your domain object and how to persist them. The following example shows a domain object: .Example domain object ==== @@ -372,28 +374,28 @@ public class Person { ---- ==== -IMPORTANT: The `@Id` annotation tells the mapper which property you want to use for the MongoDB `_id` property and the `@Indexed` annotation tells the mapping framework to call `createIndex(…)` on that property of your document, making searches faster. +IMPORTANT: The `@Id` annotation tells the mapper which property you want to use for the MongoDB `_id` property, and the `@Indexed` annotation tells the mapping framework to call `createIndex(…)` on that property of your document, making searches faster. IMPORTANT: Automatic index creation is only done for types annotated with `@Document`. [[mapping-usage-annotations]] -=== Mapping annotation overview +=== Mapping Annotation Overview -The MappingMongoConverter can use metadata to drive the mapping of objects to documents. 
An overview of the annotations is provided below +The `MappingMongoConverter` can use metadata to drive the mapping of objects to documents. The following annotations are available: -* `@Id` - applied at the field level to mark the field used for identity purpose. -* `@Document` - applied at the class level to indicate this class is a candidate for mapping to the database. You can specify the name of the collection where the database will be stored. -* `@DBRef` - applied at the field to indicate it is to be stored using a com.mongodb.DBRef. -* `@Indexed` - applied at the field level to describe how to index the field. -* `@CompoundIndex` - applied at the type level to declare Compound Indexes -* `@GeoSpatialIndexed` - applied at the field level to describe how to geoindex the field. -* `@TextIndexed` - applied at the field level to mark the field to be included in the text index. -* `@Language` - applied at the field level to set the language override property for text index. -* `@Transient` - by default all private fields are mapped to the document, this annotation excludes the field where it is applied from being stored in the database -* `@PersistenceConstructor` - marks a given constructor - even a package protected one - to use when instantiating the object from the database. Constructor arguments are mapped by name to the key values in the retrieved Document. -* `@Value` - this annotation is part of the Spring Framework . Within the mapping framework it can be applied to constructor arguments. This lets you use a Spring Expression Language statement to transform a key's value retrieved in the database before it is used to construct a domain object. In order to reference a property of a given document one has to use expressions like: `@Value("#root.myProperty")` where `root` refers to the root of the given document.
-* `@Field` - applied at the field level and described the name of the field as it will be represented in the MongoDB BSON document thus allowing the name to be different than the fieldname of the class. -* `@Version` - applied at field level is used for optimistic locking and checked for modification on save operations. The initial value is `zero` which is bumped automatically on every update. +* `@Id`: Applied at the field level to mark the field used for identity purposes. +* `@Document`: Applied at the class level to indicate this class is a candidate for mapping to the database. You can specify the name of the collection where the data is stored. +* `@DBRef`: Applied at the field level to indicate the field is to be stored by using a `com.mongodb.DBRef`. +* `@Indexed`: Applied at the field level to describe how to index the field. +* `@CompoundIndex`: Applied at the type level to declare compound indexes. +* `@GeoSpatialIndexed`: Applied at the field level to describe how to geoindex the field. +* `@TextIndexed`: Applied at the field level to mark the field to be included in the text index. +* `@Language`: Applied at the field level to set the language override property for the text index. +* `@Transient`: By default, all private fields are mapped to the document. This annotation excludes the field to which it is applied from being stored in the database. +* `@PersistenceConstructor`: Marks a given constructor (even a package-protected one) to use when instantiating the object from the database. Constructor arguments are mapped by name to the key values in the retrieved `Document`. +* `@Value`: This annotation is part of the Spring Framework. Within the mapping framework it can be applied to constructor arguments. This lets you use a Spring Expression Language statement to transform a key's value retrieved in the database before it is used to construct a domain object.
To reference a property of a given document, use expressions such as `@Value("#root.myProperty")`, where `root` refers to the root of the given document. +* `@Field`: Applied at the field level to describe the name of the field as it is represented in the MongoDB BSON document, letting the name be different from the field name of the class. +* `@Version`: Applied at the field level for optimistic locking and checked for modification on save operations. The initial value is `zero`, which is automatically bumped on every update. The mapping metadata infrastructure is defined in a separate spring-data-commons project that is technology agnostic. Specific subclasses are used in the MongoDB support to support annotation based metadata. Other strategies are also possible to put in place if there is demand. @@ -528,7 +530,7 @@ public class Person { NOTE: The text index feature is disabled by default for mongodb v.2.4. -Creating a text index allows accumulating several fields into a searchable full text index. It is only possible to have one text index per collection so all fields marked with `@TextIndexed` are combined into this index. Properties can be weighted to influence document score for ranking results. The default language for the text index is english, to change the default language set `@Document(language="spanish")` to any language you want. Using a property called `language` or `@Language` allows to define a language override on a per document base. +Creating a text index allows accumulating several fields into a searchable full-text index. It is only possible to have one text index per collection, so all fields marked with `@TextIndexed` are combined into this index. Properties can be weighted to influence the document score for ranking results. The default language for the text index is English.
To change the default language, set the `language` attribute to whichever language you want (for example, `@Document(language="spanish")`). Using a property called `language` or `@Language` lets you define a language override on a per-document basis. The following example shows how to create a text index and set the language to Spanish: .Example Text Index Usage ==== @@ -555,9 +557,9 @@ class Nested { [[mapping-usage-references]] === Using DBRefs -The mapping framework doesn't have to store child objects embedded within the document. You can also store them separately and use a DBRef to refer to that document. When the object is loaded from MongoDB, those references will be eagerly resolved and you will get back a mapped object that looks the same as if it had been stored embedded within your master document. +The mapping framework does not have to store child objects embedded within the document. You can also store them separately and use a DBRef to refer to that document. When the object is loaded from MongoDB, those references are eagerly resolved so that you get back a mapped object that looks the same as if it had been stored embedded within your master document. -Here's an example of using a DBRef to refer to a specific document that exists independently of the object in which it is referenced (both classes are shown in-line for brevity's sake): +The following example uses a DBRef to refer to a specific document that exists independently of the object in which it is referenced (both classes are shown in-line for brevity's sake): ==== [source,java] ---- @@ -583,29 +585,29 @@ public class Person { ---- ====
+You need not use `@OneToMany` or similar mechanisms because the List of objects tells the mapping framework that you want a one-to-many relationship. When the object is stored in MongoDB, there is a list of DBRefs rather than the `Account` objects themselves. -IMPORTANT: The mapping framework does not handle cascading saves. If you change an `Account` object that is referenced by a `Person` object, you must save the Account object separately. Calling `save` on the `Person` object will not automatically save the `Account` objects in the property `accounts`. +IMPORTANT: The mapping framework does not handle cascading saves. If you change an `Account` object that is referenced by a `Person` object, you must save the `Account` object separately. Calling `save` on the `Person` object does not automatically save the `Account` objects in the `accounts` property. [[mapping-usage-events]] === Mapping Framework Events Events are fired throughout the lifecycle of the mapping process. This is described in the <> section. -Simply declaring these beans in your Spring ApplicationContext will cause them to be invoked whenever the event is dispatched. +Declaring these beans in your Spring ApplicationContext causes them to be invoked whenever the event is dispatched. [[mapping-explicit-converters]] -=== Overriding Mapping with explicit Converters -When storing and querying your objects it is convenient to have a `MongoConverter` instance handle the mapping of all Java types to Documents. However, sometimes you may want the `MongoConverter` s do most of the work but allow you to selectively handle the conversion for a particular type or to optimize performance. +When storing and querying your objects, it is convenient to have a `MongoConverter` instance handle the mapping of all Java types to `Document` instances.
However, sometimes you may want the `MongoConverter` instances to do most of the work but let you selectively handle the conversion for a particular type -- perhaps to optimize performance. -To selectively handle the conversion yourself, register one or more one or more `org.springframework.core.convert.converter.Converter` instances with the MongoConverter. +To selectively handle the conversion yourself, register one or more `org.springframework.core.convert.converter.Converter` instances with the `MongoConverter`. -NOTE: Spring 3.0 introduced a core.convert package that provides a general type conversion system. This is described in detail in the Spring reference documentation section entitled http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#validation[Spring Type Conversion]. +NOTE: Spring 3.0 introduced a core.convert package that provides a general type conversion system. This is described in detail in the Spring reference documentation section entitled http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#validation["`Spring Type Conversion`"]. -The method `customConversions` in `AbstractMongoConfiguration` can be used to configure Converters. The examples <> at the beginning of this chapter show how to perform the configuration using Java and XML. +You can use the `customConversions` method in `AbstractMongoConfiguration` to configure converters. The examples <> show how to perform the configuration using Java and XML. -Below is an example of a Spring Converter implementation that converts from a Document to a Person POJO. +The following example of a Spring Converter implementation converts from a `Document` to a `Person` POJO: [source,java] ---- @@ -620,7 +622,7 @@ Below is an example of a Spring Converter implementation that converts from a Do } ----
+The following example converts from a `Person` to a `Document`: [source,java] ---- diff --git a/src/main/asciidoc/reference/mongo-3.adoc b/src/main/asciidoc/reference/mongo-3.adoc index 3ab427b4d..83c224971 100644 --- a/src/main/asciidoc/reference/mongo-3.adoc +++ b/src/main/asciidoc/reference/mongo-3.adoc @@ -1,24 +1,26 @@ [[mongo.mongo-3]] = MongoDB 3.0 Support -Spring Data MongoDB allows usage of both MongoDB Java driver generations 2 and 3 when connecting to a MongoDB 2.6/3.0 server running _MMap.v1_ or a MongoDB server 3.0 using _MMap.v1_ or the _WiredTiger_ storage engine. +Spring Data MongoDB allows usage of both MongoDB Java driver generations 2 and 3 when connecting to a MongoDB 2.6/3.0 server running MMap.v1 or a MongoDB server 3.0 using MMap.v1 or the WiredTiger storage engine. -NOTE: Please refer to the driver and database specific documentation for major differences between those. +NOTE: See the driver- and database-specific documentation for major differences between those engines. -NOTE: Operations that are no longer valid using a 3.x MongoDB Java driver have been deprecated within Spring Data and will be removed in a subsequent release. +NOTE: Operations that are no longer valid when using a 3.x MongoDB Java driver have been deprecated within Spring Data and will be removed in a subsequent release. == Using Spring Data MongoDB with MongoDB 3.0 +The rest of this section describes how to use Spring Data MongoDB with MongoDB 3.0. + [[mongo.mongo-3.configuration]] === Configuration Options -Some of the configuration options have been changed / removed for the _mongo-java-driver_. The following options will be ignored using the generation 3 driver: +Some of the configuration options have been changed or removed for the `mongo-java-driver`. 
The following options are ignored when using the generation 3 driver: - * autoConnectRetry - * maxAutoConnectRetryTime - * slaveOk + * `autoConnectRetry` + * `maxAutoConnectRetryTime` + * `slaveOk` -Generally it is recommended to use the `` and `` elements instead of `` when doing XML based configuration, since those elements will only provide you with attributes valid for the 3 generation java driver. +Generally, you should use the `` and `` elements instead of `` when doing XML-based configuration, since those elements provide you with attributes that are only valid for the third-generation Java driver. The following example shows how to configure a Mongo client connection: [source,xml] ---- @@ -37,14 +39,14 @@ Generally it is recommended to use the `` and ``. +In order to use authentication with XML configuration, you can use the `credentials` attribute on ``, as the following example shows: [source,xml] ---- @@ -82,14 +84,13 @@ In order to use authentication with XML configuration use the `credentials` attr ---- [[mongo.mongo-3.misc]] -=== Other things to be aware of +=== Miscellaneous Details -This section covers additional things to keep in mind when using the 3.0 driver. +This section briefly lists additional things to keep in mind when using the 3.0 driver: * `IndexOperations.resetIndexCache()` is no longer supported. * Any `MapReduceOptions.extraOption` is silently ignored. -* `WriteResult` does not longer hold error information but throws an Exception. -* `MongoOperations.executeInSession(…)` no longer calls `requestStart` / `requestDone`. -* Index name generation has become a driver internal operations, still we use the 2.x schema to generate names. -* Some Exception messages differ between the generation 2 and 3 servers as well as between _MMap.v1_ and _WiredTiger_ storage engine. - +* `WriteResult` no longer holds error information but, instead, throws an `Exception`.
+* `MongoOperations.executeInSession(…)` no longer calls `requestStart` and `requestDone`. +* Index name generation has become a driver-internal operation. Spring Data MongoDB still uses the 2.x schema to generate names. +* Some `Exception` messages differ between the generation 2 and 3 servers as well as between the MMap.v1 and WiredTiger storage engines. diff --git a/src/main/asciidoc/reference/mongo-auditing.adoc b/src/main/asciidoc/reference/mongo-auditing.adoc index 25cce58b3..90a8e5ce7 100644 --- a/src/main/asciidoc/reference/mongo-auditing.adoc +++ b/src/main/asciidoc/reference/mongo-auditing.adoc @@ -1,9 +1,9 @@ [[mongo.auditing]] -== General auditing configuration +== General Auditing Configuration for MongoDB -Activating auditing functionality is just a matter of adding the Spring Data Mongo `auditing` namespace element to your configuration: +To activate auditing functionality, add the Spring Data Mongo `auditing` namespace element to your configuration, as the following example shows: -.Activating auditing using XML configuration +.Activating auditing by using XML configuration ==== [source,xml] ---- @@ -11,7 +11,7 @@ Activating auditing functionality is just a matter of adding the Spring Data Mon ---- ==== -Since Spring Data MongoDB 1.4 auditing can be enabled by annotating a configuration class with the `@EnableMongoAuditing` annotation. +Since Spring Data MongoDB 1.4, auditing can be enabled by annotating a configuration class with the `@EnableMongoAuditing` annotation, as the following example shows: .Activating auditing using JavaConfig ==== @@ -29,5 +29,4 @@ class Config { ---- ==== -If you expose a bean of type `AuditorAware` to the `ApplicationContext`, the auditing infrastructure will pick it up automatically and use it to determine the current user to be set on domain types.
If you have multiple implementations registered in the `ApplicationContext`, you can select the one to be used by explicitly setting the `auditorAwareRef` attribute of `@EnableMongoAuditing`. - +If you expose a bean of type `AuditorAware` to the `ApplicationContext`, the auditing infrastructure picks it up automatically and uses it to determine the current user to be set on domain types. If you have multiple implementations registered in the `ApplicationContext`, you can select the one to be used by explicitly setting the `auditorAwareRef` attribute of `@EnableMongoAuditing`. diff --git a/src/main/asciidoc/reference/mongo-repositories.adoc b/src/main/asciidoc/reference/mongo-repositories.adoc index 567e08790..9bb5c4fdb 100644 --- a/src/main/asciidoc/reference/mongo-repositories.adoc +++ b/src/main/asciidoc/reference/mongo-repositories.adoc @@ -1,15 +1,15 @@ [[mongo.repositories]] -= MongoDB repositories += MongoDB Repositories [[mongo-repo-intro]] == Introduction -This chapter will point out the specialties for repository support for MongoDB. This builds on the core repository support explained in <>. So make sure you've got a sound understanding of the basic concepts explained there. +This chapter points out the specialties for repository support for MongoDB. This chapter builds on the core repository support explained in <>. You should have a sound understanding of the basic concepts explained there. [[mongo-repo-usage]] == Usage -To access domain entities stored in a MongoDB you can leverage our sophisticated repository support that eases implementing those quite significantly. To do so, simply create an interface for your repository: +To access domain entities stored in MongoDB, you can use our sophisticated repository support that eases implementation quite significantly.
To do so, create an interface for your repository, as the following example shows: .Sample Person entity ==== @@ -28,7 +28,7 @@ public class Person { ---- ==== -We have a quite simple domain object here. Note that it has a property named `id` of type `ObjectId`. The default serialization mechanism used in `MongoTemplate` (which is backing the repository support) regards properties named id as document id. Currently we support `String`, `ObjectId` and `BigInteger` as id-types. +Note that the domain type shown in the preceding example has a property named `id` of type `ObjectId`. The default serialization mechanism used in `MongoTemplate` (which backs the repository support) regards properties named `id` as the document ID. Currently, we support `String`, `ObjectId`, and `BigInteger` as ID types. Now that we have a domain object, we can define an interface that uses it, as follows: .Basic repository interface to persist Person entities ==== @@ -41,7 +41,7 @@ public interface PersonRepository extends PagingAndSortingRepository findAllBy(); <5> } ---- -<1> The method shows a query for all people with the given lastname. The query will be derived parsing the method name for constraints which can be concatenated with `And` and `Or`. Thus the method name will result in a query expression of `{"lastname" : lastname}`. -<2> Applies pagination to a query. Just equip your method signature with a `Pageable` parameter and let the method return a `Page` instance and we will automatically page the query accordingly. -<3> Shows that you can query based on properties which are not a primitive type. Throws `IncorrectResultSizeDataAccessException` if more than one match found. -<4> Uses the `First` keyword to restrict the query to the very first result. Unlike <3> this method does not throw an exception if more than one match was found. -<5> Uses a Java 8 `Stream` which reads and converts individual elements while iterating the stream. 
+<1> The `findByLastname` method shows a query for all people with the given last name. The query is derived by parsing the method name for constraints that can be concatenated with `And` and `Or`. Thus, the method name results in a query expression of `{"lastname" : lastname}`. +<2> Applies pagination to a query. You can equip your method signature with a `Pageable` parameter and let the method return a `Page` instance, and Spring Data automatically pages the query accordingly. +<3> Shows that you can query based on properties that are not primitive types. Throws `IncorrectResultSizeDataAccessException` if more than one match is found. +<4> Uses the `First` keyword to restrict the query to only the first result. Unlike <3>, this method does not throw an exception if more than one match is found. +<5> Uses a Java 8 `Stream` that reads and converts individual elements while iterating the stream. ==== -NOTE: Note that for version 1.0 we currently don't support referring to parameters that are mapped as `DBRef` in the domain class. +NOTE: For version 1.0, we currently do not support referring to parameters that are mapped as `DBRef` in the domain class. + +The following table shows the keywords that are supported for query methods: [cols="1,2,3", options="header"] .Supported keywords for query methods @@ -277,9 +279,9 @@ NOTE: Note that for version 1.0 we currently don't support referring to paramete NOTE: If the property criterion compares a document, the order of the fields and exact equality in the document matters. [[mongodb.repositories.queries.delete]] -=== Repository delete queries -The above keywords can be used in conjunction with `delete…By` or `remove…By` to create queries deleting matching documents. +The keywords in the preceding table can be used in conjunction with `delete…By` or `remove…By` to create queries that delete matching documents.
.`Delete…By` Query ==== [source,java] ---- @@ -294,12 +296,14 @@ public interface PersonRepository extends MongoRepository { ---- ==== -Using return type `List` will retrieve and return all matching documents before actually deleting them. A numeric return type directly removes the matching documents returning the total number of documents removed. +Using a return type of `List` retrieves and returns all matching documents before actually deleting them. A numeric return type directly removes the matching documents, returning the total number of documents removed. [[mongodb.repositories.queries.geo-spatial]] -=== Geo-spatial repository queries +=== Geo-spatial Repository Queries -As you've just seen there are a few keywords triggering geo-spatial operations within a MongoDB query. The `Near` keyword allows some further modification. Let's have a look at some examples: +As you saw in the preceding table of keywords, a few keywords trigger geo-spatial operations within a MongoDB query. The `Near` keyword allows some further modification, as the next few examples show. + +The following example shows how to define a `near` query that finds all persons within a given distance of a given point: .Advanced `Near` queries ==== [source,java] ---- @@ -313,7 +317,7 @@ public interface PersonRepository extends MongoRepository ---- ==== -Adding a `Distance` parameter to the query method allows restricting results to those within the given distance. If the `Distance` was set up containing a `Metric` we will transparently use `$nearSphere` instead of $code. +Adding a `Distance` parameter to the query method allows restricting results to those within the given distance.
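As a rough plain-Java illustration of why the unit matters for spherical near queries (a hypothetical sketch, not Spring Data code; the earth-radius constants below are assumptions for illustration): the maximum distance for a spherical query is expressed in radians, which is the linear distance divided by the earth's radius in the same unit.

```java
// Hypothetical sketch only: converts a linear distance to the radian value a
// spherical near query expects. Not Spring Data code; the earth-radius
// constants below are assumptions for illustration.
public class DistanceSketch {

    static final double EARTH_RADIUS_KM = 6378.137;    // assumed radius in km
    static final double EARTH_RADIUS_MILES = 3963.191; // assumed radius in miles

    // Distance in kilometers expressed as radians on the sphere.
    static double kilometersToRadians(double km) {
        return km / EARTH_RADIUS_KM;
    }

    // Distance in miles expressed as radians on the sphere.
    static double milesToRadians(double miles) {
        return miles / EARTH_RADIUS_MILES;
    }

    public static void main(String[] args) {
        // 200 km, as in new Distance(200, Metrics.KILOMETERS)
        System.out.println(kilometersToRadians(200));
    }
}
```

Whichever unit the caller picks, the same radian value reaches the server, which is why results do not depend on the unit chosen.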
If the `Distance` was set up containing a `Metric`, we transparently use `$nearSphere` instead of `$near`, as the following example shows: .Using `Distance` with `Metrics` ==== @@ -326,11 +330,15 @@ Distance distance = new Distance(200, Metrics.KILOMETERS); ---- ==== -As you can see using a `Distance` equipped with a `Metric` causes `$nearSphere` clause to be added instead of a plain `$near`. Beyond that the actual distance gets calculated according to the `Metrics` used. +Using a `Distance` with a `Metric` causes a `$nearSphere` (instead of a plain `$near`) clause to be added. Beyond that, the actual distance gets calculated according to the `Metrics` used. -NOTE: Using `@GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)` on the target property forces usage of `$nearSphere` operator. +(Note that `Metric` does not refer to metric units of measure. It could be miles rather than kilometers. Rather, `metric` refers to the concept of a system of measurement, regardless of which system you use.) -==== Geo-near queries +NOTE: Using `@GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)` on the target property forces usage of the `$nearSphere` operator. + +==== Geo-near Queries + +Spring Data MongoDB supports geo-near queries, as the following example shows: [source,java] ---- @@ -355,9 +363,9 @@ public interface PersonRepository extends MongoRepository ---- [[mongodb.repositories.queries.json-based]] -=== MongoDB JSON based query methods and field restriction -By adding the annotation `org.springframework.data.mongodb.repository.Query` repository finder methods you can specify a MongoDB JSON query string to use instead of having the query derived from the method name.
For example +By adding the `org.springframework.data.mongodb.repository.Query` annotation to your repository finder methods, you can specify a MongoDB JSON query string to use instead of having the query be derived from the method name, as the following example shows: [source,java] ---- @@ -369,11 +377,11 @@ public interface PersonRepository extends MongoRepository } ---- -The placeholder `?0` lets you substitute the value from the method arguments into the JSON query string. +The `?0` placeholder lets you substitute the value from the method arguments into the JSON query string. -NOTE: `String` parameter values are escaped during the binding process, which means that it is not possible to add MongoDB specific operators via the argument. +NOTE: `String` parameter values are escaped during the binding process, which means that it is not possible to add MongoDB-specific operators through the argument. -You can also use the filter property to restrict the set of properties that will be mapped into the Java object. For example, +You can also use the filter property to restrict the set of properties that is mapped into the Java object, as the following example shows: [source,java] ---- @@ -385,16 +393,16 @@ public interface PersonRepository extends MongoRepository } ---- -This will return only the firstname, lastname and Id properties of the Person objects. The age property, a java.lang.Integer, will not be set and its value will therefore be null. +The query in the preceding example returns only the `firstname`, `lastname`, and `Id` properties of the `Person` objects. The `age` property, a `java.lang.Integer`, is not set, and its value is therefore `null`. [[mongodb.repositories.queries.json-spel]] -=== JSON based queries with SpEL expressions +=== JSON-based Queries with SpEL Expressions Query strings and field definitions can be used together with SpEL expressions to create dynamic queries at runtime.
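Before looking at SpEL, the plain positional binding behind the `?0` placeholder can be sketched in a few lines of Java. This is a toy illustration only, not Spring Data's actual binder, which additionally escapes values and converts non-String types:

```java
// Toy sketch of positional parameter binding: replaces ?0, ?1, ... in a JSON
// query template with quoted argument values. Not Spring Data's actual
// binder, which also escapes values and handles non-String types.
public class BindingSketch {

    static String bind(String template, String... args) {
        String result = template;
        for (int i = 0; i < args.length; i++) {
            // Quote each value so string predicates stay valid JSON.
            result = result.replace("?" + i, "\"" + args[i] + "\"");
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(bind("{ 'firstname' : ?0 }", "Thomas"));
    }
}
```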
SpEL expressions can provide predicate values and can be used to extend predicates with subdocuments. -Expressions expose method arguments through an array that contains all arguments. The the following query uses `[0]` -to declare the predicate value for `lastname` that is equivalent to the `?0` parameter binding. +Expressions expose method arguments through an array that contains all the arguments. The following query uses `[0]` +to declare the predicate value for `lastname` (which is equivalent to the `?0` parameter binding): [source,java] ---- @@ -405,8 +413,8 @@ public interface PersonRepository extends MongoRepository } ---- -Expressions can be used to invoke functions, evaluate conditionals and construct values. SpEL expressions -reveal in conjunction with JSON a side-effect as Map-like declarations inside of SpEL read like JSON. +Expressions can be used to invoke functions, evaluate conditionals, and construct values. SpEL expressions +used in conjunction with JSON reveal a side-effect, because Map-like declarations inside of SpEL read like JSON, as the following example shows: [source,java] ---- @@ -417,12 +425,12 @@ public interface PersonRepository extends MongoRepository } ---- -SpEL in query strings can be a powerful way to enhance queries and can accept a broad range of unwanted arguments. -You should make sure to sanitize strings before passing these to the query to avoid unwanted changes to your query. +SpEL in query strings can be a powerful way to enhance queries. However, they can also accept a broad range of unwanted arguments. +You should make sure to sanitize strings before passing them to the query to avoid unwanted changes to your query. -Expression support is extensible through the Query SPI `org.springframework.data.repository.query.spi.EvaluationContextExtension` -than can contribute properties, functions and customize the root object. 
Extensions are retrieved from the application context -at the time of SpEL evaluation when the query is build. +Expression support is extensible through the Query SPI: `org.springframework.data.repository.query.spi.EvaluationContextExtension`. +The Query SPI can contribute properties and functions and can customize the root object. Extensions are retrieved from the application context +at the time of SpEL evaluation when the query is built. The following example shows how to use `EvaluationContextExtension`: [source,java] ---- @@ -444,19 +452,19 @@ NOTE: Bootstrapping `MongoRepositoryFactory` yourself is not application context to pick up Query SPI extensions. [[mongodb.repositories.queries.type-safe]] -=== Type-safe Query methods +=== Type-safe Query Methods -MongoDB repository support integrates with the http://www.querydsl.com/[QueryDSL] project which provides a means to perform type-safe queries in Java. To quote from the project description, "Instead of writing queries as inline strings or externalizing them into XML files they are constructed via a fluent API." It provides the following features +MongoDB repository support integrates with the http://www.querydsl.com/[QueryDSL] project, which provides a way to perform type-safe queries. To quote from the project description, "Instead of writing queries as inline strings or externalizing them into XML files they are constructed via a fluent API." It provides the following features: -* Code completion in IDE (all properties, methods and operations can be expanded in your favorite Java IDE) -* Almost no syntactically invalid queries allowed (type-safe on all levels) -* Domain types and properties can be referenced safely (no Strings involved!) -* Adopts better to refactoring changes in domain types -* Incremental query definition is easier +* Code completion in the IDE (all properties, methods, and operations can be expanded in your favorite Java IDE). 
+* Almost no syntactically invalid queries allowed (type-safe on all levels). +* Domain types and properties can be referenced safely -- no strings involved! +* Adapts better to refactoring changes in domain types. +* Incremental query definition is easier. -Please refer to the http://www.querydsl.com/static/querydsl/latest/reference/html/[QueryDSL documentation] which describes how to bootstrap your environment for APT based code generation using Maven or Ant. +See the http://www.querydsl.com/static/querydsl/latest/reference/html/[QueryDSL documentation] for how to bootstrap your environment for APT-based code generation using Maven or Ant. -Using QueryDSL you will be able to write queries as shown below +QueryDSL lets you write queries such as the following: [source,java] ---- @@ -467,9 +475,9 @@ Page page = repository.findAll(person.lastname.contains("a"), PageRequest.of(0, 2, Direction.ASC, "lastname")); ---- -`QPerson` is a class that is generated (via the Java annotation post processing tool) which is a `Predicate` that allows you to write type safe queries. Notice that there are no strings in the query other than the value "C0123". +`QPerson` is a class that is generated by the Java annotation post-processing tool. It is a `Predicate` that lets you write type-safe queries. Notice that there are no strings in the query other than the `C0123` value. -You can use the generated `Predicate` class via the interface `QueryDslPredicateExecutor` which is shown below +You can use the generated `Predicate` class by using the `QueryDslPredicateExecutor` interface, which the following listing shows: [source,java] ---- @@ -487,7 +495,7 @@ public interface QueryDslPredicateExecutor { } ---- -To use this in your repository implementation, simply inherit from it in addition to other repository interfaces. 
This is shown below +To use this in your repository implementation, add it to the list of repository interfaces from which your interface inherits, as the following example shows: [source,java] ---- @@ -497,13 +505,12 @@ public interface PersonRepository extends MongoRepository, Query } ---- -We think you will find this an extremely powerful tool for writing MongoDB queries. - [[mongodb.repositories.queries.full-text]] -=== Full-text search queries -MongoDBs full text search feature is very store specific and therefore can rather be found on `MongoRepository` than on the more general `CrudRepository`. What we need is a document with a full-text index defined for (Please see section <> for creating). +=== Full-text Search Queries -Additional methods on `MongoRepository` take `TextCriteria` as input parameter. In addition to those explicit methods, it is also possible to add a `TextCriteria` derived repository method. The criteria will be added as an additional `AND` criteria. Once the entity contains a `@TextScore` annotated property the documents full-text score will be retrieved. Furthermore the `@TextScore` annotated property will also make it possible to sort by the documents score. +MongoDB's full-text search feature is store-specific and, therefore, can be found on `MongoRepository` rather than on the more general `CrudRepository`. We need a document with a full-text index (see "`<>`" to learn how to create a full-text index). + +Additional methods on `MongoRepository` take `TextCriteria` as an input parameter. In addition to those explicit methods, it is also possible to add a `TextCriteria`-derived repository method. The criteria are added as an additional `AND` criteria. Once the entity contains a `@TextScore`-annotated property, the document's full-text score can be retrieved. 
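Conceptually, sorting by the full-text score reduces to ordering results by the numeric score the server computed for each matching document. The following plain-Java sketch uses hypothetical types, not the Spring Data API:

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch: ordering full-text results by a server-computed
// score, as a sort on a @TextScore-annotated property would. Plain Java
// with made-up types, not the Spring Data API.
public class ScoreSortSketch {

    record ScoredDoc(String title, double score) { }

    static List<String> titlesByScoreDesc(List<ScoredDoc> docs) {
        return docs.stream()
                .sorted(Comparator.comparingDouble(ScoredDoc::score).reversed())
                .map(ScoredDoc::title)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<ScoredDoc> docs = List.of(
                new ScoredDoc("mongodb basics", 0.75),
                new ScoredDoc("advanced mongodb", 1.5));
        System.out.println(titlesByScoreDesc(docs));
    }
}
```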
Furthermore, the `@TextScore`-annotated property also makes it possible to sort by the document's score, as the following example shows: [source, java] ---- @@ -540,13 +547,10 @@ List result = repository.findByTitleOrderByScoreDesc("mongodb" include::../{spring-data-commons-docs}/repository-projections.adoc[leveloffset=+2] -[[mongodb.repositories.misc]] -== Miscellaneous - [[mongodb.repositories.misc.cdi-integration]] -=== CDI Integration +== CDI Integration -Instances of the repository interfaces are usually created by a container, which Spring is the most natural choice when working with Spring Data. As of version 1.3.0 Spring Data MongoDB ships with a custom CDI extension that allows using the repository abstraction in CDI environments. The extension is part of the JAR so all you need to do to activate it is dropping the Spring Data MongoDB JAR into your classpath. You can now set up the infrastructure by implementing a CDI Producer for the `MongoTemplate`: +Instances of the repository interfaces are usually created by a container, and Spring is the most natural choice when working with Spring Data. As of version 1.3.0, Spring Data MongoDB ships with a custom CDI extension that lets you use the repository abstraction in CDI environments. The extension is part of the JAR. To activate it, drop the Spring Data MongoDB JAR into your classpath. You can now set up the infrastructure by implementing a CDI Producer for the `MongoTemplate`, as the following example shows: [source,java] ---- @@ -562,7 +566,7 @@ class MongoTemplateProducer { } ---- -The Spring Data MongoDB CDI extension will pick up the `MongoTemplate` available as CDI bean and create a proxy for a Spring Data repository whenever a bean of a repository type is requested by the container.
Thus obtaining an instance of a Spring Data repository is a matter of declaring an `@Inject`-ed property: +The Spring Data MongoDB CDI extension picks up the `MongoTemplate` available as a CDI bean and creates a proxy for a Spring Data repository whenever a bean of a repository type is requested by the container. Thus, obtaining an instance of a Spring Data repository is a matter of declaring an `@Inject`-ed property, as the following example shows: [source,java] ---- diff --git a/src/main/asciidoc/reference/mongodb.adoc b/src/main/asciidoc/reference/mongodb.adoc index b9ba446cf..c71ec3fee 100644 --- a/src/main/asciidoc/reference/mongodb.adoc +++ b/src/main/asciidoc/reference/mongodb.adoc @@ -1,33 +1,34 @@ [[mongo.core]] = MongoDB support -The MongoDB support contains a wide range of features which are summarized below. +The MongoDB support contains a wide range of features: -* Spring configuration support using Java based @Configuration classes or an XML namespace for a Mongo driver instance and replica sets -* MongoTemplate helper class that increases productivity performing common Mongo operations. Includes integrated object mapping between documents and POJOs. -* Exception translation into Spring's portable Data Access Exception hierarchy -* Feature Rich Object Mapping integrated with Spring's Conversion Service -* Annotation based mapping metadata but extensible to support other metadata formats -* Persistence and mapping lifecycle events -* Java based Query, Criteria, and Update DSLs -* Automatic implementation of Repository interfaces including support for custom finder methods. +* Spring configuration support with Java-based @Configuration classes or an XML namespace for a Mongo driver instance and replica sets. +* `MongoTemplate` helper class that increases productivity when performing common Mongo operations. Includes integrated object mapping between documents and POJOs. +* Exception translation into Spring's portable Data Access Exception hierarchy. 
+* Feature-rich Object Mapping integrated with Spring's Conversion Service. +* Annotation-based mapping metadata that is extensible to support other metadata formats. +* Persistence and mapping lifecycle events. +* Java-based Query, Criteria, and Update DSLs. +* Automatic implementation of Repository interfaces, including support for custom finder methods. * QueryDSL integration to support type-safe queries. -* Cross-store persistence - support for JPA Entities with fields transparently persisted/retrieved using MongoDB (deprecated - will be removed without replacement) -* GeoSpatial integration +* Cross-store persistence support for JPA Entities with fields transparently persisted and retrieved with MongoDB (deprecated - to be removed without replacement). +* GeoSpatial integration. -For most tasks you will find yourself using `MongoTemplate` or the Repository support that both leverage the rich mapping functionality. `MongoTemplate` is the place to look for accessing functionality such as incrementing counters or ad-hoc CRUD operations. `MongoTemplate` also provides callback methods so that it is easy for you to get a hold of the low level API artifacts such as `com.mongo.DB` to communicate directly with MongoDB. +For most tasks, you should use `MongoTemplate` or the Repository support, which both leverage the rich mapping functionality. `MongoTemplate` is the place to look for accessing functionality such as incrementing counters or ad-hoc CRUD operations. `MongoTemplate` also provides callback methods so that it is easy for you to get the low-level API artifacts, such as `com.mongodb.DB`, to communicate directly with MongoDB.
 The goal with naming conventions on various API artifacts is to copy those in the base MongoDB Java driver so you can easily map your existing knowledge onto the Spring APIs.
 
 [[mongodb-getting-started]]
 == Getting Started
 
-Spring MongoDB support requires MongoDB 2.6 or higher and Java SE 8 or higher. An easy way to bootstrap setting up a working environment is to create a Spring based project in http://spring.io/tools/sts[STS].
+Spring MongoDB support requires MongoDB 2.6 or higher and Java SE 8 or higher. An easy way to bootstrap setting up a working environment is to create a Spring-based project in http://spring.io/tools/sts[STS].
 
-First you need to set up a running Mongodb server. Refer to the http://docs.mongodb.org/manual/core/introduction/[Mongodb Quick Start guide] for an explanation on how to startup a MongoDB instance. Once installed starting MongoDB is typically a matter of executing the following command: `MONGO_HOME/bin/mongod`
+First, you need to set up a running MongoDB server. Refer to the http://docs.mongodb.org/manual/core/introduction/[MongoDB Quick Start guide] for an explanation of how to start up a MongoDB instance. Once installed, starting MongoDB is typically a matter of running the following command: `MONGO_HOME/bin/mongod`
 
-To create a Spring project in STS go to File -> New -> Spring Template Project -> Simple Spring Utility Project -> press Yes when prompted. Then enter a project and a package name such as org.spring.mongodb.example.
-
-Then add the following to pom.xml dependencies section.
+To create a Spring project in STS:
+
+. Go to File -> New -> Spring Template Project -> Simple Spring Utility Project, and press Yes when prompted. Then enter a project and a package name, such as `org.spring.mongodb.example`.
+. Add the following to the `pom.xml` file's `dependencies` element:
++
 [source,xml]
 ----
@@ -42,16 +43,14 @@ Then add the following to pom.xml dependencies section.
 ----
-
-Also change the version of Spring in the pom.xml to be
-
+.
Change the version of Spring in the `pom.xml` to match the following:
++
 [source,xml]
 ----
 
 {springVersion}
 
 ----
-
-You will also need to add the location of the Spring Milestone repository for maven to your `pom.xml` which is at the same level of your `` element
-
+. Add the following location of the Spring Milestone repository for Maven to your `pom.xml` so that it is at the same level as your `<dependencies>` element:
++
 [source,xml]
 ----
@@ -65,7 +64,7 @@ You will also need to add the location of the Spring Milestone repository for ma
 
 The repository is also http://repo.spring.io/milestone/org/springframework/data/[browseable here].
 
-You may also want to set the logging level to `DEBUG` to see some additional information, edit the `log4j.properties` file to have
+You may also want to set the logging level to `DEBUG` to see some additional information. To do so, edit the `log4j.properties` file to have the following content:
 
 [source]
 ----
@@ -73,7 +72,7 @@ log4j.category.org.springframework.data.mongodb=DEBUG
 log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} %5p %40.40c:%4L - %m%n
 ----
 
-Create a simple Person class to persist:
+Then you can create a `Person` class to persist:
 
 [source,java]
 ----
@@ -107,7 +106,7 @@ public class Person {
 }
 ----
 
-And a main application to run
+You also need a main application to run:
 
 [source,java]
 ----
@@ -139,7 +138,7 @@ public class MongoApp {
 }
 ----
 
-This will produce the following output
+When you run the main program, the preceding examples produce the following output:
 
 [source]
 ----
@@ -150,32 +149,32 @@ This will produce the following output
 10:01:32,984 DEBUG ramework.data.mongodb.core.MongoTemplate: 375 - Dropped collection [database.person]
 ----
 
-Even in this simple example, there are few things to take notice of
+Even in this simple example, there are a few things to notice:
+* You can instantiate the central helper class of Spring Mongo, <>, by using the standard `com.mongodb.MongoClient` object and the name of the database to use.
 * The mapper works against standard POJO objects without the need for any additional metadata (though you can optionally provide that information. See <>.).
-* Conventions are used for handling the id field, converting it to be a `ObjectId` when stored in the database.
-* Mapping conventions can use field access. Notice the Person class has only getters.
-* If the constructor argument names match the field names of the stored document, they will be used to instantiate the object
+* Conventions are used for handling the `id` field, converting it to be an `ObjectId` when stored in the database.
+* Mapping conventions can use field access. Notice that the `Person` class has only getters.
+* If the constructor argument names match the field names of the stored document, they are used to instantiate the object.
 
 [[mongo.examples-repo]]
 == Examples Repository
 
-There is an https://github.com/spring-projects/spring-data-examples[github repository with several examples] that you can download and play around with to get a feel for how the library works.
+There is a https://github.com/spring-projects/spring-data-examples[GitHub repository with several examples] that you can download and play around with to get a feel for how the library works.
 
 [[mongodb-connectors]]
 == Connecting to MongoDB with Spring
 
-One of the first tasks when using MongoDB and Spring is to create a `com.mongodb.MongoClient` object using the IoC container. There are two main ways to do this, either using Java based bean metadata or XML based bean metadata. These are discussed in the following sections.
+One of the first tasks when using MongoDB and Spring is to create a `com.mongodb.MongoClient` object using the IoC container. There are two main ways to do this, either by using Java-based bean metadata or by using XML-based bean metadata.
 Both are discussed in the following sections.
 
-NOTE: For those not familiar with how to configure the Spring container using Java based bean metadata instead of XML based metadata see the high level introduction in the reference docs http://docs.spring.io/spring/docs/3.2.x/spring-framework-reference/html/new-in-3.0.html#new-java-configuration[here ] as well as the detailed documentation http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#beans-java-instantiating-container[ here].
+NOTE: For those not familiar with how to configure the Spring container using Java-based bean metadata instead of XML-based metadata, see the high-level introduction in the reference docs http://docs.spring.io/spring/docs/3.2.x/spring-framework-reference/html/new-in-3.0.html#new-java-configuration[here] as well as the detailed documentation http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#beans-java-instantiating-container[here].
 
 [[mongo.mongo-java-config]]
-=== Registering a Mongo instance using Java based metadata
+=== Registering a Mongo Instance by Using Java-based Metadata
 
-An example of using Java based bean metadata to register an instance of a `com.mongodb.MongoClient` is shown below
+The following example shows how to use Java-based bean metadata to register an instance of a `com.mongodb.MongoClient`:
 
-.Registering a com.mongodb.MongoClient object using Java based bean metadata
+.Registering a `com.mongodb.MongoClient` object by using Java-based bean metadata
 ====
 [source,java]
 ----
@@ -192,11 +191,11 @@ public class AppConfig {
 ----
 ====
 
-This approach allows you to use the standard `com.mongodb.MongoClient` instance with the container using Spring's `MongoClientFactoryBean`.
As compared to instantiating a `com.mongodb.MongoClient` instance directly, the FactoryBean has the added advantage of also providing the container with an ExceptionTranslator implementation that translates MongoDB exceptions to exceptions in Spring's portable `DataAccessException` hierarchy for data access classes annotated with the `@Repository` annotation. This hierarchy and use of `@Repository` is described in http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/data-access.html[Spring's DAO support features].
+This approach lets you use the standard `com.mongodb.MongoClient` instance with the container, by using Spring's `MongoClientFactoryBean`. As compared to instantiating a `com.mongodb.MongoClient` instance directly, the `FactoryBean` has the added advantage of also providing the container with an `ExceptionTranslator` implementation that translates MongoDB exceptions to exceptions in Spring's portable `DataAccessException` hierarchy for data access classes annotated with the `@Repository` annotation. This hierarchy and the use of `@Repository` are described in http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/data-access.html[Spring's DAO support features].
-An example of a Java based bean metadata that supports exception translation on `@Repository` annotated classes is shown below:
+The following example shows how to use Java-based bean metadata that supports exception translation on `@Repository`-annotated classes:
 
-.Registering a com.mongodb.MongoClient object using Spring's MongoClientFactoryBean and enabling Spring's exception translation support
+.Registering a `com.mongodb.MongoClient` object by using Spring's `MongoClientFactoryBean` and enabling Spring's exception translation support
 ====
 [source,java]
 ----
@@ -215,14 +214,14 @@ public class AppConfig {
 ----
 ====
 
-To access the `com.mongodb.MongoClient` object created by the `MongoClientFactoryBean` in other `@Configuration` or your own classes, use a "`private @Autowired Mongo mongo;`" field.
+To access the `com.mongodb.MongoClient` object created by the `MongoClientFactoryBean` in other `@Configuration` classes or your own classes, use a `private @Autowired Mongo mongo;` field.
 
 [[mongo.mongo-xml-config]]
-=== Registering a Mongo instance using XML based metadata
+=== Registering a Mongo Instance by Using XML-based Metadata
 
-While you can use Spring's traditional `` XML namespace to register an instance of `com.mongodb.MongoClient` with the container, the XML can be quite verbose as it is general purpose. XML namespaces are a better alternative to configuring commonly used objects such as the Mongo instance. The mongo namespace allows you to create a Mongo instance server location, replica-sets, and options.
+While you can use Spring's traditional `<beans/>` XML namespace to register an instance of `com.mongodb.MongoClient` with the container, the XML can be quite verbose, as it is general-purpose. XML namespaces are a better alternative for configuring commonly used objects, such as the Mongo instance. The `mongo` namespace lets you configure a Mongo instance with a server location, replica sets, and options.
-To use the Mongo namespace elements you will need to reference the Mongo schema: +To use the Mongo namespace elements, you need to reference the Mongo schema, as follows: .XML schema to configure MongoDB ==== @@ -247,7 +246,7 @@ To use the Mongo namespace elements you will need to reference the Mongo schema: ---- ==== -A more advanced configuration with `MongoClientOptions` is shown below (note these are not recommended values) +The following example shows a more advanced configuration with `MongoClientOptions` (note that these are not recommended values): .XML schema to configure a com.mongodb.MongoClient object with MongoClientOptions ==== @@ -273,9 +272,9 @@ A more advanced configuration with `MongoClientOptions` is shown below (note the ---- ==== -A configuration using replica sets is shown below. +The following example shows a configuration using replica sets: -.XML schema to configure com.mongodb.MongoClient object with Replica Sets +.XML schema to configure a `com.mongodb.MongoClient` object with Replica Sets ==== [source,xml] ---- @@ -284,9 +283,9 @@ A configuration using replica sets is shown below. ==== [[mongo.mongo-db-factory]] -=== The MongoDbFactory interface +=== The MongoDbFactory Interface -While `com.mongodb.MongoClient` is the entry point to the MongoDB driver API, connecting to a specific MongoDB database instance requires additional information such as the database name and an optional username and password. With that information you can obtain a com.mongodb.DB object and access all the functionality of a specific MongoDB database instance. Spring provides the `org.springframework.data.mongodb.core.MongoDbFactory` interface shown below to bootstrap connectivity to the database. +While `com.mongodb.MongoClient` is the entry point to the MongoDB driver API, connecting to a specific MongoDB database instance requires additional information, such as the database name and an optional username and password. 
 With that information, you can obtain a `com.mongodb.DB` object and access all the functionality of a specific MongoDB database instance. Spring provides the `org.springframework.data.mongodb.core.MongoDbFactory` interface, shown in the following listing, to bootstrap connectivity to the database:
 
 [source,java]
 ----
@@ -298,9 +297,9 @@ public interface MongoDbFactory {
 }
 ----
 
-The following sections show how you can use the container with either Java or the XML based metadata to configure an instance of the `MongoDbFactory` interface. In turn, you can use the `MongoDbFactory` instance to configure `MongoTemplate`.
+The following sections show how you can use the container with either Java-based or XML-based metadata to configure an instance of the `MongoDbFactory` interface. In turn, you can use the `MongoDbFactory` instance to configure `MongoTemplate`.
 
-Instead of using the IoC container to create an instance of MongoTemplate, you can just use them in standard Java code as shown below.
+Instead of using the IoC container to create an instance of `MongoTemplate`, you can use it in standard Java code, as follows:
 
 [source,java]
 ----
@@ -321,12 +320,12 @@ public class MongoApp {
 }
 ----
 
-The code in bold highlights the use of SimpleMongoDbFactory and is the only difference between the listing shown in the <>.
+The code in bold highlights the use of `SimpleMongoDbFactory` and is the only difference from the listing shown in the <>.
 
 [[mongo.mongo-db-factory-java]]
-=== Registering a MongoDbFactory instance using Java based metadata
+=== Registering a `MongoDbFactory` Instance by Using Java-based Metadata
 
-To register a MongoDbFactory instance with the container, you write code much like what was highlighted in the previous code listing. A simple example is shown below
+To register a `MongoDbFactory` instance with the container, you write code much like what was highlighted in the previous code listing.
 The following listing shows a simple example:
 
 [source,java]
 ----
@@ -339,7 +338,7 @@ public class MongoConfiguration {
 }
 ----
 
-MongoDB Server generation 3 changed the authentication model when connecting to the DB. Therefore some of the configuration options available for authentication are no longer valid. Please use the `MongoClient` specific options for setting credentials via `MongoCredential` to provide authentication data.
+MongoDB Server generation 3 changed the authentication model when connecting to the DB. Therefore, some of the configuration options available for authentication are no longer valid. You should use the `MongoClient`-specific options for setting credentials through `MongoCredential` to provide authentication data, as shown in the following example:
 
 [source,java]
 ----
@@ -360,24 +359,27 @@ public class ApplicationContextEventTestsAppConfig extends AbstractMongoConfigur
 }
 ----
 
-In order to use authentication with XML configuration use the `credentials` attribue on ``.
+To use authentication with XML-based configuration, use the `credentials` attribute on the `<mongo:mongo-client>` element.
+
+NOTE: Username and password credentials used in XML-based configuration must be URL-encoded when they contain reserved characters, such as `:`, `%`, `@`, or `,`.
+The following example shows encoded credentials:
+
+`m0ng0@dmin:mo_res:bw6},Qsdxx@admin@database` -> `m0ng0%40dmin:mo_res%3Abw6%7D%2CQsdxx%40admin@database`
 
-NOTE: Username/password credentials used in XML configuration must be URL encoded when these contain reserved characters such as `:`, `%`, `@`, `,`.
-Example: `m0ng0@dmin:mo_res:bw6},Qsdxx@admin@database` -> `m0ng0%40dmin:mo_res%3Abw6%7D%2CQsdxx%40admin@database`
 See https://tools.ietf.org/html/rfc3986#section-2.2[section 2.2 of RFC 3986] for further details.
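The encoding shown in the preceding note can be produced with the JDK's `java.net.URLEncoder`. The following sketch (the `CredentialEncoder` class name is hypothetical, not part of Spring Data) encodes each credential part separately. Note that `URLEncoder` produces `application/x-www-form-urlencoded` output, which matches RFC 3986 percent-encoding for the reserved characters listed above but would encode a space as `+` rather than `%20`:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// Hypothetical helper: URL-encodes one part of a credential string so that
// reserved characters (:, %, @, and so on) survive the connection-string parser.
public class CredentialEncoder {

    public static String encode(String credentialPart) {
        try {
            // ':' -> %3A, '%' -> %25, '@' -> %40, ',' -> %2C, '}' -> %7D
            return URLEncoder.encode(credentialPart, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException(e); // UTF-8 is always supported
        }
    }

    public static void main(String[] args) {
        // Reproduces the encoded credentials shown in the note above.
        System.out.println(encode("m0ng0@dmin") + ":"
                + encode("mo_res:bw6},Qsdxx@admin") + "@database");
    }
}
```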
 [[mongo.mongo-db-factory-xml]]
-=== Registering a MongoDbFactory instance using XML based metadata
+=== Registering a `MongoDbFactory` Instance by Using XML-based Metadata
 
-The mongo namespace provides a convenient way to create a `SimpleMongoDbFactory` as compared to using the `` namespace. Simple usage is shown below
+The `mongo` namespace provides a convenient way to create a `SimpleMongoDbFactory`, as compared to using the `<beans/>` namespace, as shown in the following example:
 
 [source,xml]
 ----
 
 ----
 
-If you need to configure additional options on the `com.mongodb.MongoClient` instance that is used to create a `SimpleMongoDbFactory` you can refer to an existing bean using the `mongo-ref` attribute as shown below. To show another common usage pattern, this listing shows the use of a property placeholder to parametrise the configuration and creating `MongoTemplate`.
+If you need to configure additional options on the `com.mongodb.MongoClient` instance that is used to create a `SimpleMongoDbFactory`, you can refer to an existing bean by using the `mongo-ref` attribute, as shown in the following example. To show another common usage pattern, the following listing shows the use of a property placeholder, which lets you parametrize the configuration and the creation of a `MongoTemplate`:
+The `MongoTemplate` class, located in the `org.springframework.data.mongodb.core` package, is the central class of Spring's MongoDB support and provides a rich feature set for interacting with the database. The template offers convenience operations to create, update, delete, and query MongoDB documents and provides a mapping between your domain objects and MongoDB documents. NOTE: Once configured, `MongoTemplate` is thread-safe and can be reused across multiple instances. -The mapping between MongoDB documents and domain classes is done by delegating to an implementation of the interface `MongoConverter`. Spring provides the `MappingMongoConverter`, but you can also write your own converter. Please refer to the section on MongoConverters for more detailed information. +The mapping between MongoDB documents and domain classes is done by delegating to an implementation of the `MongoConverter` interface. Spring provides `MappingMongoConverter`, but you can also write your own converter. See "`<>`" for more detailed information. -The `MongoTemplate` class implements the interface `MongoOperations`. In as much as possible, the methods on `MongoOperations` are named after methods available on the MongoDB driver `Collection` object to make the API familiar to existing MongoDB developers who are used to the driver API. For example, you will find methods such as "find", "findAndModify", "findOne", "insert", "remove", "save", "update" and "updateMulti". The design goal was to make it as easy as possible to transition between the use of the base MongoDB driver and `MongoOperations`. A major difference in between the two APIs is that MongoOperations can be passed domain objects instead of `Document` and there are fluent APIs for `Query`, `Criteria`, and `Update` operations instead of populating a `Document` to specify the parameters for those operations. +The `MongoTemplate` class implements the interface `MongoOperations`. 
As far as possible, the methods on `MongoOperations` are named after methods available on the MongoDB driver `Collection` object, to make the API familiar to existing MongoDB developers who are used to the driver API. For example, you can find methods such as `find`, `findAndModify`, `findOne`, `insert`, `remove`, `save`, `update`, and `updateMulti`. The design goal was to make it as easy as possible to transition between the use of the base MongoDB driver and `MongoOperations`. A major difference between the two APIs is that `MongoOperations` can be passed domain objects instead of `Document`. Also, `MongoOperations` has fluent APIs for `Query`, `Criteria`, and `Update` operations instead of populating a `Document` to specify the parameters for those operations.
 
-NOTE: The preferred way to reference the operations on `MongoTemplate` instance is via its interface `MongoOperations`.
+NOTE: The preferred way to reference the operations on a `MongoTemplate` instance is through its interface, `MongoOperations`.
 
-The default converter implementation used by `MongoTemplate` is MappingMongoConverter. While the `MappingMongoConverter` can make use of additional metadata to specify the mapping of objects to documents it is also capable of converting objects that contain no additional metadata by using some conventions for the mapping of IDs and collection names. These conventions as well as the use of mapping annotations is explained in the <>.
+The default converter implementation used by `MongoTemplate` is `MappingMongoConverter`. While the `MappingMongoConverter` can use additional metadata to specify the mapping of objects to documents, it can also convert objects that contain no additional metadata by using some conventions for the mapping of IDs and collection names. These conventions, as well as the use of mapping annotations, are explained in the "`<>`" chapter.
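As a brief illustration of the fluent `Query`, `Criteria`, and `Update` style mentioned above, consider the following sketch. The `PersonQueries` class name is hypothetical, a configured `MongoOperations` instance is assumed to be injected, and `Person` (with `name` and `age` fields) follows the earlier getting-started example; this is an illustration under those assumptions, not a listing from the guide:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

public class PersonQueries {

    private final MongoOperations mongoOps;

    public PersonQueries(MongoOperations mongoOps) {
        this.mongoOps = mongoOps;
    }

    public Person findJoeOlderThan30() {
        // Fluent Criteria API instead of hand-building a Document of parameters
        return mongoOps.findOne(
                new Query(where("name").is("Joe").and("age").gt(30)), Person.class);
    }

    public void incrementAgeOfFirstJoe() {
        // Fluent Update API; updates the first matching document only
        mongoOps.updateFirst(new Query(where("name").is("Joe")),
                new Update().inc("age", 1), Person.class);
    }
}
```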
-Another central feature of MongoTemplate is exception translation of exceptions thrown in the MongoDB Java driver into Spring's portable Data Access Exception hierarchy. Refer to the section on <> for more information.
+Another central feature of `MongoTemplate` is translation of exceptions thrown by the MongoDB Java driver into Spring's portable Data Access Exception hierarchy. See "`<>`" for more information.
 
-While there are many convenience methods on `MongoTemplate` to help you easily perform common tasks if you should need to access the MongoDB driver API directly to access functionality not explicitly exposed by the MongoTemplate you can use one of several Execute callback methods to access underlying driver APIs. The execute callbacks will give you a reference to either a `com.mongodb.Collection` or a `com.mongodb.DB` object. Please see the section mongo.executioncallback[Execution Callbacks] for more information.
+`MongoTemplate` offers many convenience methods to help you easily perform common tasks. However, if you need to directly access the MongoDB driver API, you can use one of several `Execute` callback methods. The execute callbacks give you a reference to either a `com.mongodb.Collection` or a `com.mongodb.DB` object. See <<mongo.executioncallback,Execution Callbacks>> for more information.
 
-Now let's look at an example of how to work with the `MongoTemplate` in the context of the Spring container.
+The next section contains an example of how to work with the `MongoTemplate` in the context of the Spring container.
 
 [[mongo-template.instantiating]]
 === Instantiating MongoTemplate
 
-You can use Java to create and register an instance of `MongoTemplate` as shown below.
+You can use Java to create and register an instance of `MongoTemplate`, as the following example shows:
 
 .Registering a com.mongodb.MongoClient object and enabling Spring's exception translation support
 ====
 [source,java]
 ----
@@ -449,13 +451,13 @@ public class AppConfig {
 ----
 ====
 
-There are several overloaded constructors of MongoTemplate. These are
+There are several overloaded constructors of `MongoTemplate`:
 
-* `MongoTemplate(MongoClient mongo, String databaseName)` - takes the `com.mongodb.MongoClient` object and the default database name to operate against.
-* `MongoTemplate(MongoDbFactory mongoDbFactory)` - takes a MongoDbFactory object that encapsulated the `com.mongodb.MongoClient` object, database name, and username and password.
-* `MongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter mongoConverter)` - adds a MongoConverter to use for mapping.
+* `MongoTemplate(MongoClient mongo, String databaseName)`: Takes the `com.mongodb.MongoClient` object and the default database name to operate against.
+* `MongoTemplate(MongoDbFactory mongoDbFactory)`: Takes a `MongoDbFactory` object that encapsulates the `com.mongodb.MongoClient` object, database name, and username and password.
+* `MongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter mongoConverter)`: Adds a `MongoConverter` to use for mapping.
 
-You can also configure a MongoTemplate using Spring's XML schema.
+You can also configure a `MongoTemplate` by using Spring's XML schema, as the following example shows:
 
 [source,java]
 ----
@@ -467,24 +469,24 @@ You can also configure a MongoTemplate by using Spring's XML schema.
 
 ----
 
-Other optional properties that you might like to set when creating a `MongoTemplate` are the default `WriteResultCheckingPolicy`, `WriteConcern`, and `ReadPreference`.
+Other optional properties that you might like to set when creating a `MongoTemplate` are the default `WriteResultCheckingPolicy`, `WriteConcern`, and `ReadPreference` values.
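As a minimal sketch of the first constructor listed above (a fragment, not a complete program; it assumes the Mongo Java driver and Spring Data MongoDB are on the classpath and that a MongoDB instance is listening on the default local port, with "database" as a placeholder name):

```java
// Connect to localhost:27017 and operate against the "database" database,
// referencing the template through its MongoOperations interface.
MongoOperations mongoOps = new MongoTemplate(new MongoClient(), "database");
```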
-NOTE: The preferred way to reference the operations on `MongoTemplate` instance is via its interface `MongoOperations`.
+NOTE: The preferred way to reference the operations on a `MongoTemplate` instance is through its interface, `MongoOperations`.
 
 [[mongo-template.writeresultchecking]]
-=== WriteResultChecking Policy
+=== `WriteResultChecking` Policy
 
-When in development it is very handy to either log or throw an exception if the `com.mongodb.WriteResult` returned from any MongoDB operation contains an error. It is quite common to forget to do this during development and then end up with an application that looks like it runs successfully but in fact the database was not modified according to your expectations. Set MongoTemplate's property to an enum with the following values, `EXCEPTION`, or `NONE` to either throw an Exception or do nothing. The default is to use a `WriteResultChecking` value of `NONE`.
+During development, it is handy to either log or throw an exception if the `com.mongodb.WriteResult` returned from any MongoDB operation contains an error. It is quite common to forget to do this during development and then end up with an application that looks like it runs successfully when, in fact, the database was not modified according to your expectations. You can set the `WriteResultChecking` property of `MongoTemplate` to one of the following values: `EXCEPTION` or `NONE`, to either throw an `Exception` or do nothing, respectively. The default is to use a `WriteResultChecking` value of `NONE`.
 
 [[mongo-template.writeconcern]]
-=== WriteConcern
+=== `WriteConcern`
 
-You can set the `com.mongodb.WriteConcern` property that the `MongoTemplate` will use for write operations if it has not yet been specified via the driver at a higher level such as `com.mongodb.MongoClient`. If MongoTemplate's `WriteConcern` property is not set it will default to the one set in the MongoDB driver's DB or Collection setting.
+If it has not yet been specified through the driver at a higher level (such as `com.mongodb.MongoClient`), you can set the `com.mongodb.WriteConcern` property that the `MongoTemplate` uses for write operations. If the `WriteConcern` property is not set, it defaults to the one set in the MongoDB driver's DB or Collection setting. [[mongo-template.writeconcernresolver]] -=== WriteConcernResolver +=== `WriteConcernResolver` -For more advanced cases where you want to set different `WriteConcern` values on a per-operation basis (for remove, update, insert and save operations), a strategy interface called `WriteConcernResolver` can be configured on `MongoTemplate`. Since `MongoTemplate` is used to persist POJOs, the `WriteConcernResolver` lets you create a policy that can map a specific POJO class to a `WriteConcern` value. The `WriteConcernResolver` interface is shown below. +For more advanced cases where you want to set different `WriteConcern` values on a per-operation basis (for remove, update, insert, and save operations), a strategy interface called `WriteConcernResolver` can be configured on `MongoTemplate`. Since `MongoTemplate` is used to persist POJOs, the `WriteConcernResolver` lets you create a policy that can map a specific POJO class to a `WriteConcern` value. The following listing shows the `WriteConcernResolver` interface: [source,java] ---- @@ -493,7 +495,7 @@ public interface WriteConcernResolver { } ---- -The passed in argument, MongoAction, is what you use to determine the `WriteConcern` value to be used or to use the value of the Template itself as a default. `MongoAction` contains the collection name being written to, the `java.lang.Class` of the POJO, the converted `Document`, as well as the operation as an enumeration (`MongoActionOperation`: REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE) and a few other pieces of contextual information. 
 For example,
+You can use the `MongoAction` argument to determine the `WriteConcern` value or use the value of the template itself as a default. `MongoAction` contains the collection name being written to, the `java.lang.Class` of the POJO, the converted `Document`, the operation (`REMOVE`, `UPDATE`, `INSERT`, `INSERT_LIST`, or `SAVE`), and a few other pieces of contextual information. The following example shows two sets of classes getting different `WriteConcern` settings:
 
 [source]
 ----
@@ -513,9 +515,9 @@ private class MyAppWriteConcernResolver implements WriteConcernResolver {
 
 [[mongo-template.save-update-remove]]
 == Saving, Updating, and Removing Documents
 
-`MongoTemplate` provides a simple way for you to save, update, and delete your domain objects and map those objects to documents stored in MongoDB.
+`MongoTemplate` lets you save, update, and delete your domain objects and map those objects to documents stored in MongoDB.
 
-Given a simple class such as Person
+Consider the following class:
 
 [source,java]
 ----
@@ -548,7 +550,7 @@ public class Person {
 }
 ----
 
-You can save, update and delete the object as shown below.
+Given the `Person` class in the preceding example, you can save, update, and delete the object, as the following example shows:
 
 NOTE: `MongoOperations` is the interface that `MongoTemplate` implements.
 
@@ -606,7 +608,7 @@ public class MongoApp {
 }
 ----
 
-This would produce the following log output (including debug messages from `MongoTemplate` itself)
+The preceding example would produce the following log output (including debug messages from `MongoTemplate`):
+`MongoConverter` provided implicit conversion between a `String` and an `ObjectId` stored in the database by recognizing (through convention) the `id` property name.
 
-NOTE: This example is meant to show the use of save, update and remove operations on MongoTemplate and not to show complex mapping functionality
+NOTE: The preceding example is meant to show the use of save, update, and remove operations on `MongoTemplate` and not to show complex mapping functionality.
 
-The query syntax used in the example is explained in more detail in the section <>.
+The query syntax used in the preceding example is explained in more detail in the section "`<>`".
 
 [[mongo-template.id-handling]]
-=== How the `_id` field is handled in the mapping layer
+=== How the `_id` Field is Handled in the Mapping Layer
 
-MongoDB requires that you have an `_id` field for all documents. If you don't provide one the driver will assign a `ObjectId` with a generated value. When using the `MappingMongoConverter` there are certain rules that govern how properties from the Java class is mapped to this `_id` field.
+MongoDB requires that you have an `_id` field for all documents. If you do not provide one, the driver assigns an `ObjectId` with a generated value. When you use the `MappingMongoConverter`, certain rules govern how properties from the Java class are mapped to this `_id` field:
 
-The following outlines what property will be mapped to the `_id` document field:
+. A property or field annotated with `@Id` (`org.springframework.data.annotation.Id`) maps to the `_id` field.
+. A property or field without an annotation but named `id` maps to the `_id` field.
 
-* A property or field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the `_id` field.
-* A property or field without an annotation but named `id` will be mapped to the `_id` field.
+The following outlines what type conversion, if any, is done on the property mapped to the `_id` document field when using the `MappingMongoConverter` (the default for `MongoTemplate`). -The following outlines what type conversion, if any, will be done on the property mapped to the _id document field when using the `MappingMongoConverter`, the default for `MongoTemplate`. +. If possible, an `id` property or field declared as a `String` in the Java class is converted to and stored as an `ObjectId` by using a Spring `Converter`. Valid conversion rules are delegated to the MongoDB Java driver. If it cannot be converted to an `ObjectId`, then the value is stored as a string in the database. +. An `id` property or field declared as `BigInteger` in the Java class is converted to and stored as an `ObjectId` by using a Spring `Converter`. -* An id property or field declared as a String in the Java class will be converted to and stored as an `ObjectId` if possible using a Spring `Converter`. Valid conversion rules are delegated to the MongoDB Java driver. If it cannot be converted to an ObjectId, then the value will be stored as a string in the database. -* An id property or field declared as `BigInteger` in the Java class will be converted to and stored as an `ObjectId` using a Spring `Converter`. +If no field or property specified in the previous sets of rules is present in the Java class, an implicit `_id` field is generated by the driver but not mapped to a property or field of the Java class. -If no field or property specified above is present in the Java class then an implicit `_id` file will be generated by the driver but not mapped to a property or field of the Java class. - -When querying and updating `MongoTemplate` will use the converter to handle conversions of the `Query` and `Update` objects that correspond to the above rules for saving documents so field names and types used in your queries will be able to match what is in your domain classes.
+When querying and updating, `MongoTemplate` uses the converter that corresponds to the preceding rules for saving documents so that field names and types used in your queries can match what is in your domain classes. [[mongo-template.type-mapping]] -=== Type mapping +=== Type Mapping -As MongoDB collections can contain documents that represent instances of a variety of types. A great example here is if you store a hierarchy of classes or simply have a class with a property of type `Object`. In the latter case the values held inside that property have to be read in correctly when retrieving the object. Thus we need a mechanism to store type information alongside the actual document. +MongoDB collections can contain documents that represent instances of a variety of types. This feature can be useful if you store a hierarchy of classes or have a class with a property of type `Object`. In the latter case, the values held inside that property have to be read in correctly when retrieving the object. Thus, we need a mechanism to store type information alongside the actual document. -To achieve that the `MappingMongoConverter` uses a `MongoTypeMapper` abstraction with `DefaultMongoTypeMapper` as it's main implementation. Its default behavior is storing the fully qualified classname under `_class` inside the document. Type hints are written for top-level documents as well as for every value if it's a complex type and a subtype of the property type declared. +To achieve that, the `MappingMongoConverter` uses a `MongoTypeMapper` abstraction with `DefaultMongoTypeMapper` as its main implementation. Its default behavior is to store the fully qualified classname under `_class` inside the document. Type hints are written for top-level documents as well as for every value (if it is a complex type and a subtype of the declared property type).
The following example (with a JSON representation at the end) shows how the mapping works: .Type mapping ==== @@ -680,13 +680,13 @@ mongoTemplate.save(sample); ---- ==== -As you can see we store the type information as last field for the actual root class as well as for the nested type as it is complex and a subtype of `Contact`. So if you're now using `mongoTemplate.findAll(Object.class, "sample")` we are able to find out that the document stored shall be a `Sample` instance. We are also able to find out that the value property shall be a `Person` actually. +Spring Data MongoDB stores the type information as the last field for the actual root class as well as for the nested type (because it is complex and a subtype of `Contact`). So, if you now use `mongoTemplate.findAll(Object.class, "sample")`, you can find out that the document stored is a `Sample` instance. You can also find out that the value property is actually a `Person`. -==== Customizing type mapping +==== Customizing Type Mapping -In case you want to avoid writing the entire Java class name as type information but rather like to use some key you can use the `@TypeAlias` annotation at the entity class being persisted. If you need to customize the mapping even more have a look at the `TypeInformationMapper` interface. An instance of that interface can be configured at the `DefaultMongoTypeMapper` which can be configured in turn on `MappingMongoConverter`. +If you want to avoid writing the entire Java class name as type information but would rather use a key, you can use the `@TypeAlias` annotation on the entity class. If you need to customize the mapping even more, have a look at the `TypeInformationMapper` interface. An instance of that interface can be configured at the `DefaultMongoTypeMapper`, which can, in turn, be configured on `MappingMongoConverter`.
The following example shows how to define a type alias for an entity: -.Defining a TypeAlias for an Entity +.Defining a type alias for an entity ==== [source,java] ---- @@ -697,13 +697,13 @@ class Person { ---- ==== -Note that the resulting document will contain `"pers"` as the value in the `_class` Field. +Note that the resulting document contains `pers` as the value in the `_class` field. -==== Configuring custom type mapping +==== Configuring Custom Type Mapping -The following example demonstrates how to configure a custom `MongoTypeMapper` in `MappingMongoConverter`. +The following example shows how to configure a custom `MongoTypeMapper` in `MappingMongoConverter`: -.Configuring a custom MongoTypeMapper via Spring Java Config +.Configuring a custom `MongoTypeMapper` with Spring Java Config ==== [source,java] ---- @@ -711,7 +711,7 @@ class CustomMongoTypeMapper extends DefaultMongoTypeMapper { //implement custom type mapping here } ---- -==== + [source,java] ---- @@ -742,10 +742,13 @@ class SampleMongoConfiguration extends AbstractMongoConfiguration { } } ---- +==== -Note that we are extending the `AbstractMongoConfiguration` class and override the bean definition of the `MappingMongoConverter` where we configure our custom `MongoTypeMapper`. +Note that the preceding example extends the `AbstractMongoConfiguration` class and overrides the bean definition of the `MappingMongoConverter` where we configure our custom `MongoTypeMapper`. -.Configuring a custom MongoTypeMapper via XML +The following example shows how to use XML to configure a custom `MongoTypeMapper`: + +.Configuring a custom `MongoTypeMapper` with XML ==== [source,xml] ---- @@ -756,17 +759,17 @@ Note that we are extending the `AbstractMongoConfiguration` class and override t ==== [[mongo-template.save-insert]] -=== Methods for saving and inserting documents +=== Methods for Saving and Inserting Documents -There are several convenient methods on `MongoTemplate` for saving and inserting your objects.
To have more fine-grained control over the conversion process you can register Spring converters with the `MappingMongoConverter`, for example `Converter<Person, Document>` and `Converter<Document, Person>`. +There are several convenient methods on `MongoTemplate` for saving and inserting your objects. To have more fine-grained control over the conversion process, you can register Spring converters with the `MappingMongoConverter` -- for example `Converter<Person, Document>` and `Converter<Document, Person>`. -NOTE: The difference between insert and save operations is that a save operation will perform an insert if the object is not already present. +NOTE: The difference between insert and save operations is that a save operation performs an insert if the object is not already present. -The simple case of using the save operation is to save a POJO. In this case the collection name will be determined by name (not fully qualified) of the class. You may also call the save operation with a specific collection name. The collection to store the object can be overridden using mapping metadata. +The simple case of using the save operation is to save a POJO. In this case, the collection name is determined by the name (not the fully qualified name) of the class. You may also call the save operation with a specific collection name. You can use mapping metadata to override the collection in which to store the object. -When inserting or saving, if the Id property is not set, the assumption is that its value will be auto-generated by the database. As such, for auto-generation of an ObjectId to succeed the type of the Id property/field in your class must be either a `String`, `ObjectId`, or `BigInteger`. +When inserting or saving, if the `Id` property is not set, the assumption is that its value will be auto-generated by the database. Consequently, for auto-generation of an `ObjectId` to succeed, the type of the `Id` property or field in your class must be a `String`, an `ObjectId`, or a `BigInteger`.
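A custom converter of the kind mentioned above can be sketched as follows. This is a minimal illustration rather than code from the guide: the `PersonWriteConverter` name and the abbreviated `Person` class are stand-ins, and registering the converter with the `MappingMongoConverter` is omitted:

```java
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;

// Abbreviated stand-in for the guide's Person sample class.
class Person {
    String id;
    String name;
    Person(String id, String name) { this.id = id; this.name = name; }
}

// A write converter that maps a Person to a MongoDB Document explicitly.
// How it is registered with the MappingMongoConverter is omitted here.
class PersonWriteConverter implements Converter<Person, Document> {
    @Override
    public Document convert(Person source) {
        return new Document("_id", source.id).append("name", source.name);
    }
}
```

Such a converter takes full control of the resulting document, bypassing the default property-by-property mapping for that type.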
-Here is a basic example of using the save operation and retrieving its contents. +The following example shows how to save a document and retrieve its contents: .Inserting and retrieving documents using the MongoTemplate ==== @@ -783,43 +786,43 @@ Person qp = mongoTemplate.findOne(query(where("age").is(33)), Person.class); ---- ==== -The insert/save operations available to you are listed below. +The following insert and save operations are available: -* `void` *save* `(Object objectToSave)` Save the object to the default collection. -* `void` *save* `(Object objectToSave, String collectionName)` Save the object to the specified collection. +* `void` *save* `(Object objectToSave)`: Save the object to the default collection. +* `void` *save* `(Object objectToSave, String collectionName)`: Save the object to the specified collection. -A similar set of insert operations is listed below +A similar set of insert operations is also available: -* `void` *insert* `(Object objectToSave)` Insert the object to the default collection. -* `void` *insert* `(Object objectToSave, String collectionName)` Insert the object to the specified collection. +* `void` *insert* `(Object objectToSave)`: Insert the object to the default collection. +* `void` *insert* `(Object objectToSave, String collectionName)`: Insert the object to the specified collection. [[mongo-template.save-insert.collection]] -==== Which collection will my documents be saved into? +==== Into Which Collection Are My Documents Saved? -There are two ways to manage the collection name that is used for operating on the documents. The default collection name that is used is the class name changed to start with a lower-case letter. So a `com.test.Person` class would be stored in the "person" collection. You can customize this by providing a different collection name using the @Document annotation.
You can also override the collection name by providing your own collection name as the last parameter for the selected MongoTemplate method calls. +There are two ways to manage the collection name that is used for the documents. The default collection name that is used is the class name changed to start with a lower-case letter. So a `com.test.Person` class is stored in the `person` collection. You can customize this by providing a different collection name with the `@Document` annotation. You can also override the collection name by providing your own collection name as the last parameter for the selected `MongoTemplate` method calls. [[mongo-template.save-insert.individual]] -==== Inserting or saving individual objects +==== Inserting or Saving Individual Objects -The MongoDB driver supports inserting a collection of documents in one operation. The methods in the MongoOperations interface that support this functionality are listed below +The MongoDB driver supports inserting a collection of documents in a single operation. The following methods in the `MongoOperations` interface support this functionality: -* *insert* inserts an object. If there is an existing document with the same id then an error is generated. -* *insertAll* takes a `Collection` of objects as the first parameter. This method inspects each object and inserts it to the appropriate collection based on the rules specified above. -* *save* saves the object overwriting any object that might exist with the same id. +* *insert*: Inserts an object. If there is an existing document with the same `id`, an error is generated. +* *insertAll*: Takes a `Collection` of objects as the first parameter. This method inspects each object and inserts it into the appropriate collection, based on the rules specified earlier. +* *save*: Saves the object, overwriting any object that might have the same `id`. 
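The default naming rule described in this section (the simple class name with its first letter lower-cased) can be sketched with the following helper. It is purely illustrative and not part of the Spring Data API:

```java
// Illustrative helper (not Spring Data code): derives the default
// collection name described in the text -- the simple class name with
// its first character lower-cased, so com.test.Person maps to "person".
class CollectionNames {
    static String defaultName(Class<?> type) {
        String simple = type.getSimpleName();
        return Character.toLowerCase(simple.charAt(0)) + simple.substring(1);
    }
}
```

In practice you rarely need such a helper, because `@Document` and the collection-name overloads shown above cover the cases where the default name is not what you want.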
[[mongo-template.save-insert.batch]] -==== Inserting several objects in a batch +==== Inserting Several Objects in a Batch -The MongoDB driver supports inserting a collection of documents in one operation. The methods in the MongoOperations interface that support this functionality are listed below +The MongoDB driver supports inserting a collection of documents in one operation. The following methods in the `MongoOperations` interface support this functionality: -* *insert* methods that take a `Collection` as the first argument. This inserts a list of objects in a single batch write to the database. +* *insert* methods: Take a `Collection` as the first argument. They insert a list of objects in a single batch write to the database. [[mongodb-template-update]] -=== Updating documents in a collection +=== Updating Documents in a Collection -For updates we can elect to update the first document found using `MongoOperation` 's method `updateFirst` or we can update all documents that were found to match the query using the method `updateMulti`. Here is an example of an update of all SAVINGS accounts where we are adding a one-time $50.00 bonus to the balance using the `$inc` operator. +For updates, you can update the first document found by using `MongoOperations.updateFirst` or you can update all documents that were found to match the query by using the `MongoOperations.updateMulti` method. The following example shows an update of all `SAVINGS` accounts where we are adding a one-time $50.00 bonus to the balance by using the `$inc` operator: -.Updating documents using the MongoTemplate +.Updating documents by using the `MongoTemplate` ==== [source,java] ---- @@ -834,22 +837,22 @@ WriteResult wr = mongoTemplate.updateMulti(new Query(where("accounts.accountType ---- ==== -In addition to the `Query` discussed above we provide the update definition using an `Update` object. The `Update` class has methods that match the update modifiers available for MongoDB.
+In addition to the `Query` discussed earlier, we provide the update definition by using an `Update` object. The `Update` class has methods that match the update modifiers available for MongoDB. -As you can see most methods return the `Update` object to provide a fluent style for the API. +Most methods return the `Update` object to provide a fluent style for the API. [[mongodb-template-update.methods]] -==== Methods for executing updates for documents +==== Methods for Executing Updates for Documents -* *updateFirst* Updates the first document that matches the query document criteria with the provided updated document. -* *updateMulti* Updates all objects that match the query document criteria with the provided updated document. +* *updateFirst*: Updates the first document that matches the query document criteria with the updated document. +* *updateMulti*: Updates all objects that match the query document criteria with the updated document. [[mongodb-template-update.update]] -==== Methods for the Update class +==== Methods in the `Update` Class -The Update class can be used with a little 'syntax sugar' as its methods are meant to be chained together and you can kick-start the creation of a new Update instance via the static method `public static Update update(String key, Object value)` and using static imports. +You can use a little "`syntax sugar`" with the `Update` class, as its methods are meant to be chained together. Also, you can kick-start the creation of a new `Update` instance by using `public static Update update(String key, Object value)` and using static imports.
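Assuming a static import of `Update.update`, the chained style described above might look like the following sketch (the field names are illustrative):

```java
import static org.springframework.data.mongodb.core.query.Update.update;

import org.springframework.data.mongodb.core.query.Update;

// Kick-start with the static factory method (which issues a $set),
// then chain further modifiers. Field names are illustrative.
Update u = update("status", "ACTIVE")  // $set : { "status" : "ACTIVE" }
        .inc("visits", 1)              // $inc : { "visits" : 1 }
        .unset("temporaryFlag");       // $unset : removes "temporaryFlag"
```

The resulting `Update` can then be passed to `updateFirst` or `updateMulti`, as shown in the preceding example.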
-Here is a listing of methods on the Update class +The `Update` class contains the following methods: * `Update` *addToSet* `(String key, Object value)` Update using the `$addToSet` update modifier * `Update` *currentDate* `(String key)` Update using the `$currentDate` update modifier @@ -868,7 +871,7 @@ Here is a listing of methods on the Update class * `Update` *setOnInsert* `(String key, Object value)` Update using the `$setOnInsert` update modifier * `Update` *unset* `(String key)` Update using the `$unset` update modifier -Some update modifiers like `$push` and `$addToSet` allow nesting of additional operators. +Some update modifiers, such as `$push` and `$addToSet`, allow nesting of additional operators. [source] ---- @@ -880,18 +883,15 @@ new Update().push("key").atPosition(Position.FIRST).each(Arrays.asList("Arya", " // { $push : { "key" : { "$slice" : 5 , "$each" : [ "Arya" , "Arry" , "Weasel" ] } } } new Update().push("key").slice(5).each(Arrays.asList("Arya", "Arry", "Weasel")); ----- -[source] ----- // { $addToSet : { "values" : { "$each" : [ "spring" , "data" , "mongodb" ] } } } new Update().addToSet("values").each("spring", "data", "mongodb"); ---- [[mongo-template.upserts]] -=== Upserting documents in a collection +=== "`Upserting`" Documents in a Collection -Related to performing an `updateFirst` operations, you can also perform an upsert operation which will perform an insert if no document is found that matches the query. The document that is inserted is a combination of the query document and the update document. Here is an example +Related to performing an `updateFirst` operation, you can also perform an "`upsert`" operation, which will perform an insert if no document is found that matches the query. The document that is inserted is a combination of the query document and the update document. 
The following example shows how to use the `upsert` method: [source] ---- @@ -899,9 +899,9 @@ template.upsert(query(where("ssn").is(1111).and("firstName").is("Joe").and("Frai ---- [[mongo-template.find-and-upsert]] -=== Finding and Upserting documents in a collection +=== Finding and Upserting Documents in a Collection -The `findAndModify(…)` method on DBCollection can update a document and return either the old or newly updated document in a single operation. `MongoTemplate` provides a findAndModify method that takes `Query` and `Update` classes and converts from `Document` to your POJOs. Here are the methods +The `findAndModify(…)` method on `DBCollection` can update a document and return either the old or newly updated document in a single operation. `MongoTemplate` provides four `findAndModify` overloaded methods that take `Query` and `Update` classes and convert from `Document` to your POJOs: [source,java] ---- @@ -914,7 +914,7 @@ The `findAndModify(…)` method on DBCollection can update a document and return T findAndModify(Query query, Update update, FindAndModifyOptions options, Class entityClass, String collectionName); ---- -As an example usage, we will insert of few `Person` objects into the container and perform a simple findAndUpdate operation +The following example inserts a few `Person` objects into the collection and performs a `findAndModify` operation: [source,java] ---- @@ -936,7 +936,7 @@ p = template.findAndModify(query, update, new FindAndModifyOptions().returnNew(t assertThat(p.getAge(), is(25)); ---- -The `FindAndModifyOptions` lets you set the options of returnNew, upsert, and remove. An example extending off the previous code snippet is shown below +The `FindAndModifyOptions` class lets you set the options of `returnNew`, `upsert`, and `remove`.
An example extending from the previous code snippet follows: [source,java] ---- @@ -947,9 +947,9 @@ assertThat(p.getAge(), is(1)); ---- [[mongo-template.delete]] -=== Methods for removing documents +=== Methods for Removing Documents -You can use several overloaded methods to remove an object from the database. +You can use one of five overloaded methods to remove an object from the database: ==== [source,java] ---- @@ -964,17 +964,17 @@ template.findAllAndRemove(query(where("lastname").is("lannister"), "GOT"); <4> template.findAllAndRemove(new Query().limit(3), "GOT"); <5> ---- -<1> Remove a single entity via its `_id` from the associated collection. -<2> Remove all documents matching the criteria of the query from the `GOT` collection. -<3> Remove the first 3 documents in the `GOT` collection. Unlike <2>, the documents to remove are identified via their `_id` executing the given query applying `sort`, `limit` and `skip` options first and then remove all at once in a separate step. +<1> Remove a single entity specified by its `_id` from the associated collection. +<2> Remove all documents that match the criteria of the query from the `GOT` collection. +<3> Remove the first three documents in the `GOT` collection. Unlike <2>, the documents to remove are first identified by their `_id` (by running the given query with its `sort`, `limit`, and `skip` options applied) and then removed all at once in a separate step. <4> Remove all documents matching the criteria of the query from the `GOT` collection. Unlike <3>, documents do not get deleted in a batch but one by one. -<5> Remove the first 3 documents in the `GOT` collection. Unlike <3>, documents do not get deleted in a batch but one by one. +<5> Remove the first three documents in the `GOT` collection. Unlike <3>, documents do not get deleted in a batch but one by one.
==== [[mongo-template.optimistic-locking]] -=== Optimistic locking +=== Optimistic Locking -The `@Version` annotation provides a JPA similar semantic in the context of MongoDB and makes sure updates are only applied to documents with matching version. Therefore the actual value of the version property is added to the update query in a way that the update won't have any effect if another operation altered the document in between. In that case an `OptimisticLockingFailureException` is thrown. +The `@Version` annotation provides syntax similar to that of JPA in the context of MongoDB and makes sure updates are only applied to documents with a matching version. Therefore, the actual value of the version property is added to the update query in such a way that the update does not have any effect if another operation altered the document in the meantime. In that case, an `OptimisticLockingFailureException` is thrown. The following example shows these features: ==== [source,java] ---- @@ -988,19 +988,19 @@ class Person { @Version Long version; } -Person daenerys = template.insert(new Person("Daenerys")); <1> +Person daenerys = template.insert(new Person("Daenerys")); <1> -Person tmp = teplate.findOne(query(where("id").is(daenerys.getId())), Person.class); <2> +Person tmp = template.findOne(query(where("id").is(daenerys.getId())), Person.class); <2> daenerys.setLastname("Targaryen"); -template.save(daenerys); <3> +template.save(daenerys); <3> -template.save(tmp); // throws OptimisticLockingFailureException <4> +template.save(tmp); // throws OptimisticLockingFailureException <4> ---- <1> Initially insert the document. `version` is set to `0`. -<2> Load the just inserted document `version` is still `0`. +<2> Load the just inserted document.
`version` is still `0`. +<3> Update the document with `version = 0`. Set the `lastname` and bump `version` to `1`. +<4> Try to update the previously loaded document that still has `version = 0`. The operation fails with an `OptimisticLockingFailureException`, as the current `version` is `1`. ==== IMPORTANT: Using MongoDB driver version 3 requires you to set the `WriteConcern` to `ACKNOWLEDGED`. Otherwise, `OptimisticLockingFailureException` can be silently swallowed. @@ -1008,7 +1008,7 @@ IMPORTANT: Using MongoDB driver version 3 requires to set the `WriteConcern` to [[mongo.query]] == Querying Documents -You can express your queries using the `Query` and `Criteria` classes which have method names that mirror the native MongoDB operator names such as `lt`, `lte`, `is`, and others. The `Query` and `Criteria` classes follow a fluent API style so that you can easily chain together multiple method criteria and queries while having easy to understand the code. Static imports in Java are used to help remove the need to see the 'new' keyword for creating `Query` and `Criteria` instances so as to improve readability. If you like to create `Query` instances from a plain JSON String use `BasicQuery`. +You can use the `Query` and `Criteria` classes to express your queries. They have method names that mirror the native MongoDB operator names, such as `lt`, `lte`, `is`, and others. The `Query` and `Criteria` classes follow a fluent API style so that you can chain together multiple method criteria and queries while having easy-to-understand code. To improve readability, static imports let you avoid using the `new` keyword for creating `Query` and `Criteria` instances.
You can also use `BasicQuery` to create `Query` instances from plain JSON Strings, as shown in the following example: .Creating a Query instance from a plain JSON String ==== [source,java] ---- @@ -1019,14 +1019,12 @@ List result = mongoTemplate.find(query, Person.class); ---- ==== -GeoSpatial queries are also supported and are described more in the section <>. - -Map-Reduce operations are also supported and are described more in the section <>. +Spring Data MongoDB also supports GeoSpatial queries (see the <> section) and Map-Reduce operations (see the <> section). [[mongodb-template-query]] -=== Querying documents in a collection +=== Querying Documents in a Collection -We saw how to retrieve a single document using the findOne and findById methods on MongoTemplate in previous sections which return a single domain object. We can also query for a collection of documents to be returned as a list of domain objects. Assuming that we have a number of Person objects with name and age stored as documents in a collection and that each person has an embedded account document with a balance. We can now run a query using the following code. +Earlier, we saw how to retrieve a single document by using the `findOne` and `findById` methods on `MongoTemplate`. These methods return a single domain object. We can also query for a collection of documents to be returned as a list of domain objects. Assuming that we have a number of `Person` objects with name and age stored as documents in a collection and that each person has an embedded account document with a balance, we can now run a query using the following code:
The criteria is specified using a `Criteria` object that has a static factory method named `where` used to instantiate a new `Criteria` object. We recommend using a static import for `org.springframework.data.mongodb.core.query.Criteria.where` and `Query.query` to make the query more readable. +All find methods take a `Query` object as a parameter. This object defines the criteria and options used to perform the query. The criteria are specified by using a `Criteria` object that has a static factory method named `where` to instantiate a new `Criteria` object. We recommend using static imports for `org.springframework.data.mongodb.core.query.Criteria.where` and `Query.query` to make the query more readable. -This query should return a list of `Person` objects that meet the specified criteria. The `Criteria` class has the following methods that correspond to the operators provided in MongoDB. - -As you can see most methods return the `Criteria` object to provide a fluent style for the API. +The query should return a list of `Person` objects that meet the specified criteria. The rest of this section lists the methods of the `Criteria` and `Query` classes that correspond to the operators provided in MongoDB. Most methods return the `Criteria` object, to provide a fluent style for the API. 
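Assuming static imports of `Criteria.where` and `Query.query`, the fluent style just described might look like the following sketch (the field names are illustrative):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.Query;

// Chain several criteria into a single query. Field names are illustrative.
Query q = query(where("age").gte(21).lt(50)
        .and("accounts.balance").gt(1000.0d));
```

The resulting `Query` can be passed to any of the find methods, such as `mongoTemplate.find(q, Person.class)`.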
[[mongodb-template-query.criteria]] -==== Methods for the Criteria class +==== Methods for the `Criteria` Class + +The `Criteria` class provides the following methods, all of which correspond to operators in MongoDB: * `Criteria` *all* `(Object o)` Creates a criterion using the `$all` operator * `Criteria` *and* `(String key)` Adds a chained `Criteria` with the specified `key` to the current `Criteria` and returns the newly created one @@ -1073,7 +1071,7 @@ As you can see most methods return the `Criteria` object to provide a fluent sty * `Criteria` *size* `(int s)` Creates a criterion using the `$size` operator * `Criteria` *type* `(int t)` Creates a criterion using the `$type` operator -There are also methods on the Criteria class for geospatial queries. Here is a listing but look at the section on <> to see them in action. +The `Criteria` class also provides the following methods for geospatial queries (see the <> section to see them in action): * `Criteria` *within* `(Circle circle)` Creates a geospatial criterion using `$geoWithin $center` operators. * `Criteria` *within* `(Box box)` Creates a geospatial criterion using a `$geoWithin $box` operation. @@ -1083,11 +1081,12 @@ There are also methods on the Criteria class for geospatial queries. Here is a l * `Criteria` *minDistance* `(double minDistance)` Creates a geospatial criterion using the `$minDistance` operation, for use with `$near`. * `Criteria` *maxDistance* `(double maxDistance)` Creates a geospatial criterion using the `$maxDistance` operation, for use with `$near`. -The `Query` class has some additional methods used to provide options for the query.
[[mongodb-template-query.query]] ==== Methods for the `Query` Class +The `Query` class has some additional methods that provide options for the query: + * `Query` *addCriteria* `(Criteria criteria)` used to add additional criteria to the query * `Field` *fields* `()` used to define fields to be included in the query results * `Query` *limit* `(int limit)` used to limit the size of the returned results to the provided limit (used for paging) @@ -1095,22 +1094,22 @@ The `Query` class has some additional methods used to provide options for the qu * `Query` *with* `(Sort sort)` used to provide sort definition for the results [[mongo-template.querying]] -=== Methods for querying for documents +=== Methods for Querying for Documents -The query methods need to specify the target type T that will be returned and they are also overloaded with an explicit collection name for queries that should operate on a collection other than the one indicated by the return type. +The query methods need to specify the target type `T` that is returned, and they are overloaded with an explicit collection name for queries that should operate on a collection other than the one indicated by the return type. The following query methods let you find one or more documents: -* *findAll* Query for a list of objects of type T from the collection. -* *findOne* Map the results of an ad-hoc query on the collection to a single instance of an object of the specified type. -* *findById* Return an object of the given id and target class. -* *find* Map the results of an ad-hoc query on the collection to a List of the specified type. -* *findAndRemove* Map the results of an ad-hoc query on the collection to a single instance of an object of the specified type. The first document that matches the query is returned and also removed from the collection in the database. +* *findAll*: Query for a list of objects of type `T` from the collection.
+* *findOne*: Map the results of an ad-hoc query on the collection to a single instance of an object of the specified type. +* *findById*: Return an object of the given ID and target class. +* *find*: Map the results of an ad-hoc query on the collection to a `List` of the specified type. +* *findAndRemove*: Map the results of an ad-hoc query on the collection to a single instance of an object of the specified type. The first document that matches the query is returned and removed from the collection in the database. [[mongo.geospatial]] === GeoSpatial Queries -MongoDB supports GeoSpatial queries through the use of operators such as `$near`, `$within`, `geoWithin` and `$nearSphere`. Methods specific to geospatial queries are available on the `Criteria` class. There are also a few shape classes, `Box`, `Circle`, and `Point` that are used in conjunction with geospatial related `Criteria` methods. +MongoDB supports GeoSpatial queries through the use of operators such as `$near`, `$within`, `$geoWithin`, and `$nearSphere`. Methods specific to geospatial queries are available on the `Criteria` class. There are also a few shape classes (`Box`, `Circle`, and `Point`) that are used in conjunction with geospatial-related `Criteria` methods. -To understand how to perform GeoSpatial queries we will use the following Venue class taken from the integration tests which relies on using the rich `MappingMongoConverter`. +To understand how to perform GeoSpatial queries, consider the following `Venue` class (taken from the integration tests and relying on the rich `MappingMongoConverter`): [source,java] ---- @@ -1151,7 +1150,7 @@ public class Venue { } ---- -To find locations within a `Circle`, the following query can be used.
+To find locations within a `Circle`, you can use the following query: [source,java] ---- @@ -1160,7 +1159,7 @@ List venues = template.find(new Query(Criteria.where("location").within(circle)), Venue.class); ---- -To find venues within a `Circle` using spherical coordinates the following query can be used +To find venues within a `Circle` using spherical coordinates, you can use the following query: [source,java] ---- @@ -1169,7 +1168,7 @@ List venues = template.find(new Query(Criteria.where("location").withinSphere(circle)), Venue.class); ---- -To find venues within a `Box` the following query can be used +To find venues within a `Box`, you can use the following query: [source,java] ---- @@ -1179,7 +1178,7 @@ List venues = template.find(new Query(Criteria.where("location").within(box)), Venue.class); ---- -To find venues near a `Point`, the following queries can be used +To find venues near a `Point`, you can use the following queries: [source,java] ---- @@ -1195,7 +1194,7 @@ List venues = template.find(new Query(Criteria.where("location").near(point).minDistance(0.01).maxDistance(100)), Venue.class); ---- -To find venues near a `Point` using spherical coordinates the following query can be used +To find venues near a `Point` using spherical coordinates, you can use the following query: [source,java] ---- @@ -1207,9 +1206,9 @@ List venues = ---- [[mongo.geo-near]] -==== Geo near queries +==== Geo-near Queries -MongoDB supports querying the database for geo locations and calculation the distance from a given origin at the very same time. With geo-near queries it's possible to express queries like: "find all restaurants in the surrounding 10 miles". To do so `MongoOperations` provides `geoNear(…)` methods taking a `NearQuery` as argument as well as the already familiar entity type and collection +MongoDB supports querying the database for geo locations and calculating the distance from a given origin at the same time. 
With geo-near queries, you can express queries such as "find all restaurants in the surrounding 10 miles". To let you do so, `MongoOperations` provides `geoNear(…)` methods that take a `NearQuery` as an argument (as well as the already familiar entity type and collection), as shown in the following example: [source,java] ---- @@ -1219,20 +1218,18 @@ NearQuery query = NearQuery.near(location).maxDistance(new Distance(10, Metrics. GeoResults<Restaurant> results = operations.geoNear(query, Restaurant.class); ---- -As you can see we use the `NearQuery` builder API to set up a query to return all `Restaurant` instances surrounding the given `Point` by 10 miles maximum. The `Metrics` enum used here actually implements an interface so that other metrics could be plugged into a distance as well. A `Metric` is backed by a multiplier to transform the distance value of the given metric into native distances. The sample shown here would consider the 10 to be miles. Using one of the pre-built in metrics (miles and kilometers) will automatically trigger the spherical flag to be set on the query. If you want to avoid that, simply hand in plain `double` values into `maxDistance(…)`. For more information see the JavaDoc of `NearQuery` and `Distance`. +We use the `NearQuery` builder API to set up a query to return all `Restaurant` instances surrounding the given `Point` out to 10 miles. The `Metrics` enum used here actually implements an interface so that other metrics could be plugged into a distance as well. A `Metric` is backed by a multiplier to transform the distance value of the given metric into native distances. The sample shown here would consider the 10 to be miles. Using one of the built-in metrics (miles and kilometers) automatically triggers the spherical flag to be set on the query. If you want to avoid that, pass plain `double` values into `maxDistance(…)`.
For more information, see the https://docs.spring.io/spring-data/mongodb/docs/current/api/index.html[JavaDoc] of `NearQuery` and `Distance`. -The geo near operations return a `GeoResults` wrapper object that encapsulates `GeoResult` instances. The wrapping `GeoResults` allows accessing the average distance of all results. A single `GeoResult` object simply carries the entity found plus its distance from the origin. +The geo-near operations return a `GeoResults` wrapper object that encapsulates `GeoResult` instances. Wrapping `GeoResults` allows accessing the average distance of all results. A single `GeoResult` object carries the entity found plus its distance from the origin. [[mongo.geo-json]] === GeoJSON Support -MongoDB supports http://geojson.org/[GeoJSON] and simple (legacy) coordinate pairs for geospatial data. Those formats can both be used for storing as well as querying data. +MongoDB supports http://geojson.org/[GeoJSON] and simple (legacy) coordinate pairs for geospatial data. Those formats can both be used for storing as well as querying data. See the http://docs.mongodb.org/manual/core/2dsphere/#geospatial-indexes-store-geojson/[MongoDB manual on GeoJSON support] to learn about requirements and restrictions. -NOTE: Please refer to the http://docs.mongodb.org/manual/core/2dsphere/#geospatial-indexes-store-geojson/[MongoDB manual on GeoJSON support] to learn about requirements and restrictions. +==== GeoJSON Types in Domain Classes -==== GeoJSON types in domain classes - -Usage of http://geojson.org/[GeoJSON] types in domain classes is straight forward. The `org.springframework.data.mongodb.core.geo` package contains types like `GeoJsonPoint`, `GeoJsonPolygon` and others. Those are extensions to the existing `org.springframework.data.geo` types. +Usage of http://geojson.org/[GeoJSON] types in domain classes is straightforward. The `org.springframework.data.mongodb.core.geo` package contains types such as `GeoJsonPoint`, `GeoJsonPolygon`, and others. 
These types extend the existing `org.springframework.data.geo` types. The following example uses a `GeoJsonPoint`: ==== [source,java] ---- @@ -1253,9 +1250,9 @@ public class Store { ---- ==== -==== GeoJSON types in repository query methods +==== GeoJSON Types in Repository Query Methods -Using GeoJSON types as repository query parameters forces usage of the `$geometry` operator when creating the query. +Using GeoJSON types as repository query parameters forces usage of the `$geometry` operator when creating the query, as the following example shows: ==== [source,java] ---- @@ -1307,20 +1304,20 @@ repo.findByLocationWithin( <4> new Point(-73.961138, 40.760348), new Point(-73.991658, 40.730006)); ---- -<1> Repository method definition using the commons type allows calling it with both GeoJSON and legacy format. -<2> Use GeoJSON type the make use of `$geometry` operator. -<3> Plase note that GeoJSON polygons need the define a closed ring. -<4> Use legacy format `$polygon` operator. +<1> Repository method definition using the commons type allows calling it with both the GeoJSON and the legacy format. +<2> Use the GeoJSON type to make use of the `$geometry` operator. +<3> Note that GeoJSON polygons need to define a closed ring. +<4> Use the legacy-format `$polygon` operator. ==== [[mongo.textsearch]] -=== Full Text Queries +=== Full-text Queries -Since MongoDB 2.6 full text queries can be executed using the `$text` operator. Methods and operations specific for full text queries are available in `TextQuery` and `TextCriteria`. When doing full text search please refer to the http://docs.mongodb.org/manual/reference/operator/query/text/#behavior[MongoDB reference] for its behavior and limitations. +Since version 2.6 of MongoDB, you can run full-text queries by using the `$text` operator. Methods and operations specific to full-text queries are available in `TextQuery` and `TextCriteria`.
When doing a full-text search, see the http://docs.mongodb.org/manual/reference/operator/query/text/#behavior[MongoDB reference] for its behavior and limitations. -==== Full Text Search +==== Full-text Search -Before we are actually able to use full text search we have to ensure to set up the search index correctly. Please refer to section <> for creating index structures. +Before you can actually use full-text search, you must set up the search index correctly. See <> for more detail on how to create index structures. The following example shows how to set up a full-text search: [source,javascript] ---- @@ -1337,7 +1334,7 @@ db.foo.createIndex( ) ---- -A query searching for `coffee cake`, sorted by relevance according to the `weights` can be defined and executed as: +A query searching for `coffee cake`, sorted by relevance according to the `weights`, can be defined and executed as follows: [source,java] ---- @@ -1345,7 +1342,7 @@ Query query = TextQuery.searching(new TextCriteria().matchingAny("coffee", "cake List page = template.find(query, Document.class); ---- -Exclusion of search terms can directly be done by prefixing the term with `-` or using `notMatching` +You can exclude search terms by prefixing the term with `-` or by using `notMatching`, as shown in the following example (note that the two lines are equivalent): [source,java] ---- @@ -1354,7 +1351,7 @@ TextQuery.searching(new TextCriteria().matching("coffee").matching("-cake")); TextQuery.searching(new TextCriteria().matching("coffee").notMatching("cake")); ---- -As `TextCriteria.matching` takes the provided term as is. Therefore phrases can be defined by putting them between double quotes (eg. `\"coffee cake\")` or using `TextCriteria.phrase.` +`TextCriteria.matching` takes the provided term as is.
Therefore, you can define phrases by putting them between double quotation marks (for example, `\"coffee cake\"`) or by using `TextCriteria.phrase`. The following example shows both ways of defining a phrase: [source,java] ---- @@ -1363,12 +1360,12 @@ TextQuery.searching(new TextCriteria().matching("\"coffee cake\"")); TextQuery.searching(new TextCriteria().phrase("coffee cake")); ---- -The flags for `$caseSensitive` and `$diacriticSensitive` can be set via the according methods on `TextCriteria`. Please note that these two optional flags have been introduced in MongoDB 3.2 and will not be included in the query unless explicitly set. +You can set flags for `$caseSensitive` and `$diacriticSensitive` by using the corresponding methods on `TextCriteria`. Note that these two optional flags were introduced in MongoDB 3.2 and are not included in the query unless explicitly set. [[mongo.collation]] === Collations -MongoDB supports since 3.4 collations for collection and index creation and various query operations. Collations define string comparison rules based on the http://userguide.icu-project.org/collation/concepts[ICU collations]. A collation document consists of various properties that are encapsulated in `Collation`: +Since version 3.4, MongoDB supports collations for collection and index creation and various query operations. Collations define string comparison rules based on the http://userguide.icu-project.org/collation/concepts[ICU collations]. A collation document consists of various properties that are encapsulated in `Collation`, as the following listing shows: ==== [source,java] ---- @@ -1386,15 +1383,15 @@ Collation collation = Collation.of("fr") <1> .normalizationEnabled(); <6> ---- -<1> `Collation` requires a locale for creation. This can be either a string representation of the locale, a `Locale` (considering language, country and variant) or a `CollationLocale`. The locale is mandatory for creation.
-<2> Collation strength defines comparison levels denoting differences between characters. You can configure various options (case-sensitivity, case-ordering) depending on the selected strength. +<1> `Collation` requires a locale for creation. This can be either a string representation of the locale, a `Locale` (considering language, country, and variant) or a `CollationLocale`. The locale is mandatory for creation. +<2> Collation strength defines comparison levels that denote differences between characters. You can configure various options (case-sensitivity, case-ordering, and others), depending on the selected strength. <3> Specify whether to compare numeric strings as numbers or as strings. <4> Specify whether the collation should consider whitespace and punctuation as base characters for purposes of comparison. <5> Specify whether strings with diacritics sort from back of the string, such as with some French dictionary ordering. -<6> Specify whether to check if text requires normalization and to perform normalization. +<6> Specify whether to check whether text requires normalization and whether to perform it. ==== -Collations can be used to create collections and indexes. If you create a collection specifying a collation, the collation is applied to index creation and queries unless you specify a different collation. A collation is valid for a whole operation and cannot be specified on a per-field basis. +Collations can be used to create collections and indexes. If you create a collection that specifies a collation, the collation is applied to index creation and queries unless you specify a different collation. A collation is valid for a whole operation and cannot be specified on a per-field basis, as the following example shows: [source,java] ---- @@ -1408,7 +1405,7 @@ template.indexOps(Person.class).ensureIndex(new Index("name", Direction.ASC).col NOTE: MongoDB uses simple binary comparison if no collation is specified (`Collation.simple()`).
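The strength property can be pictured with the JDK's own ICU-based collator. The following plain-Java sketch uses `java.text.Collator` (not the MongoDB API, but the same ICU collation concept) to show that primary strength compares base characters only, while secondary strength also considers accents:

```java
import java.text.Collator;
import java.util.Locale;

// Plain-Java illustration (java.text.Collator, not MongoDB) of collation
// strength levels: PRIMARY compares base characters only, while SECONDARY
// also treats accent differences as significant.
public class CollationStrengthDemo {

    public static boolean equalAtStrength(String left, String right, int strength) {
        Collator collator = Collator.getInstance(Locale.FRENCH);
        collator.setStrength(strength);
        return collator.compare(left, right) == 0;
    }

    public static void main(String[] args) {
        // "cote" and "coté" differ only by an accent.
        System.out.println(equalAtStrength("cote", "coté", Collator.PRIMARY));   // true
        System.out.println(equalAtStrength("cote", "coté", Collator.SECONDARY)); // false
    }
}
```

The MongoDB `Collation.strength(…)` option controls the same comparison levels on the server side.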
-Using collations with collection operations is a matter of specifying a `Collation` instance in your query or operation options. +Using collations with collection operations is a matter of specifying a `Collation` instance in your query or operation options, as the following two examples show: .Using collation with `find` ==== @@ -1441,16 +1438,16 @@ AggregationResults results = template.aggregate(aggregation, "tags", T ---- ==== -WARNING: Indexes are only used if the collation used for the operation and the index collation matches. +WARNING: Indexes are only used if the collation used for the operation matches the index collation. [[mongo.query.fluent-template-api]] === Fluent Template API -The `MongoOperations` interface is one of the central components when it comes to more low level interaction with MongoDB. It offers a wide range of methods covering needs from collection / index creation and CRUD operations to more advanced functionality like map-reduce and aggregations. -One can find multiple overloads for each and every method. Most of them just cover optional / nullable parts of the API. +The `MongoOperations` interface is one of the central components when it comes to more low-level interaction with MongoDB. It offers a wide range of methods covering needs from collection creation, index creation, and CRUD operations to more advanced functionality, such as Map-Reduce and aggregations. +You can find multiple overloads for each method. Most of them cover optional or nullable parts of the API. -`FluentMongoOperations` provide a more narrow interface for common methods of `MongoOperations` providing a more readable, fluent API. -The entry points `insert(…)`, `find(…)`, `update(…)`, etc. follow a natural naming schema based on the operation to execute. Moving on from the entry point the API is designed to only offer context dependent methods guiding towards a terminating method that invokes the actual `MongoOperations` counterpart. 
+`FluentMongoOperations` provides a narrower interface for the common methods of `MongoOperations` and a more readable, fluent API. +The entry points (`insert(…)`, `find(…)`, `update(…)`, and others) follow a natural naming schema based on the operation to be run. Moving on from the entry point, the API is designed to offer only context-dependent methods that lead to a terminating method that invokes the actual `MongoOperations` counterpart -- the `all` method in the case of the following example: ==== [source,java] ---- @@ -1459,11 +1456,11 @@ List all = ops.find(SWCharacter.class) .inCollection("star-wars") <1> .all(); ---- -<1> Skip this step if `SWCharacter` defines the collection via `@Document` or if using the class name as the collection name is just fine. +<1> Skip this step if `SWCharacter` defines the collection with `@Document` or if you are fine with using the class name as the collection name. ==== -Sometimes a collection in MongoDB holds entities of different types. Like a `Jedi` within a collection of `SWCharacters`. -To use different types for `Query` and return value mapping one can use `as(Class targetType)` map results differently. +Sometimes, a collection in MongoDB holds entities of different types, such as a `Jedi` within a collection of `SWCharacters`. +To use different types for `Query` and return value mapping, you can use `as(Class targetType)` to map results differently, as the following example shows: ==== [source,java] ---- @@ -1477,11 +1474,11 @@ List all = ops.find(SWCharacter.class) <1> <2> Resulting documents are mapped into `Jedi`. ==== -TIP: It is possible to directly apply <> to resulting documents by providing just the `interface` type via `as(Class)`. +TIP: You can directly apply <> to result documents by providing the `interface` type with `as(Class)`. -Switching between retrieving a single entity, multiple ones as `List` or `Stream` like is done via the terminating methods `first()`, `one()`, `all()` or `stream()`.
+You can switch between retrieving a single entity and retrieving multiple entities as a `List` or a `Stream` through the terminating methods: `first()`, `one()`, `all()`, or `stream()`. -When writing a geo-spatial query via `near(NearQuery)` the number of terminating methods is altered to just the ones valid for executing a `geoNear` command in MongoDB fetching entities as `GeoResult` within `GeoResults`. +When writing a geo-spatial query with `near(NearQuery)`, the number of terminating methods is altered to include only the methods that are valid for executing a `geoNear` command in MongoDB (fetching entities as a `GeoResult` within `GeoResults`), as the following example shows: ==== [source,java] @@ -1499,14 +1496,14 @@ include::query-by-example.adoc[leveloffset=+1] [[mongo.mapreduce]] == Map-Reduce Operations -You can query MongoDB using Map-Reduce which is useful for batch processing, data aggregation, and for when the query language doesn't fulfill your needs. +You can query MongoDB by using Map-Reduce, which is useful for batch processing, for data aggregation, and for when the query language does not fulfill your needs. -Spring provides integration with MongoDB's map reduce by providing methods on MongoOperations to simplify the creation and execution of Map-Reduce operations. It can convert the results of a Map-Reduce operation to a POJO also integrates with Spring's http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#resources[Resource abstraction] abstraction. This will let you place your JavaScript files on the file system, classpath, http server or any other Spring Resource implementation and then reference the JavaScript resources via an easy URI style syntax, e.g. 'classpath:reduce.js;. Externalizing JavaScript code in files is often preferable to embedding them as Java strings in your code. Note that you can still pass JavaScript code as Java strings if you prefer. 
+Spring provides integration with MongoDB's Map-Reduce by providing methods on `MongoOperations` to simplify the creation and execution of Map-Reduce operations. It can convert the results of a Map-Reduce operation to a POJO and integrates with Spring's http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#resources[Resource abstraction]. This lets you place your JavaScript files on the file system, classpath, HTTP server, or any other Spring Resource implementation and then reference the JavaScript resources through an easy URI style syntax -- for example, `classpath:reduce.js`. Externalizing JavaScript code in files is often preferable to embedding it as Java strings in your code. Note that you can still pass JavaScript code as Java strings if you prefer. [[mongo.mapreduce.example]] === Example Usage -To understand how to perform Map-Reduce operations an example from the book 'MongoDB - The definitive guide' is used. In this example we will create three documents that have the values [a,b], [b,c], and [c,d] respectfully. The values in each document are associated with the key 'x' as shown below. For this example assume these documents are in the collection named "jmr1". +To understand how to perform Map-Reduce operations, we use an example from the book, _MongoDB - The Definitive Guide_ footnote:[Kristina Chodorow. _MongoDB - The Definitive Guide_. O'Reilly Media, 2013]. In this example, we create three documents that have the values [a,b], [b,c], and [c,d], respectively.
The values in each document are associated with the key `x`, as the following example shows (assume these documents are in a collection named `jmr1`): [source] ---- @@ -1515,7 +1512,7 @@ To understand how to perform Map-Reduce operations an example from the book 'Mon { "_id" : ObjectId("4e5ff893c0277826074ec535"), "x" : [ "c", "d" ] } ---- -A map function that will count the occurrence of each letter in the array for each document is shown below +The following map function counts the occurrence of each letter in the array for each document: [source,java] ---- @@ -1526,7 +1523,7 @@ function () { } ---- -The reduce function that will sum up the occurrence of each letter across all the documents is shown below +The following reduce function sums up the occurrence of each letter across all the documents: [source,java] ---- @@ -1538,7 +1535,7 @@ function (key, values) { } ---- -Executing this will result in a collection as shown below. +Running the preceding functions results in the following collection: [source] ---- @@ -1548,7 +1545,7 @@ Executing this will result in a collection as shown below. { "_id" : "d", "value" : 1 } ---- -Assuming that the map and reduce functions are located in `map.js` and `reduce.js` and bundled in your jar so they are available on the classpath, you can execute a map-reduce operation and obtain the results as shown below +Assuming that the map and reduce functions are located in `map.js` and `reduce.js` and bundled in your jar so they are available on the classpath, you can run a Map-Reduce operation as follows: [source,java] ---- @@ -1558,7 +1555,7 @@ for (ValueObject valueObject : results) { } ---- -The output of the above code is +The preceding example produces the following output: [source] ---- @@ -1568,7 +1565,7 @@ ValueObject [id=c, value=2.0] ValueObject [id=d, value=1.0] ----
The `ValueObject` class is simply +The `MapReduceResults` class implements `Iterable` and provides access to the raw output, as well as timing and count statistics. The following listing shows the `ValueObject` class: [source,java] ---- @@ -1596,7 +1593,7 @@ public class ValueObject { } ---- -By default the output type of INLINE is used so you don't have to specify an output collection. To specify additional map-reduce options use an overloaded method that takes an additional `MapReduceOptions` argument. The class `MapReduceOptions` has a fluent API so adding additional options can be done in a very compact syntax. Here an example that sets the output collection to "jmr1_out". Note that setting only the output collection assumes a default output type of REPLACE. +By default, the output type of `INLINE` is used so that you need not specify an output collection. To specify additional Map-Reduce options, use an overloaded method that takes an additional `MapReduceOptions` argument. The class `MapReduceOptions` has a fluent API, so adding additional options can be done in a compact syntax.
The following example sets the output collection to `jmr1_out` (note that setting only the output collection assumes a default output type of `REPLACE`): [source,java] ---- @@ -1604,7 +1601,7 @@ MapReduceResults results = mongoOperations.mapReduce("jmr1", "class new MapReduceOptions().outputCollection("jmr1_out"), ValueObject.class); ---- -There is also a static import `import static org.springframework.data.mongodb.core.mapreduce.MapReduceOptions.options;` that can be used to make the syntax slightly more compact +There is also a static import (`import static org.springframework.data.mongodb.core.mapreduce.MapReduceOptions.options;`) that can be used to make the syntax slightly more compact, as the following example shows: [source,java] ---- @@ -1612,7 +1609,7 @@ MapReduceResults results = mongoOperations.mapReduce("jmr1", "class options().outputCollection("jmr1_out"), ValueObject.class); ---- -You can also specify a query to reduce the set of data that will be used to feed into the map-reduce operation. This will remove the document that contains [a,b] from consideration for map-reduce operations. +You can also specify a query to reduce the set of data that is fed into the Map-Reduce operation. The following example removes the document that contains [a,b] from consideration for Map-Reduce operations: [source,java] ---- @@ -1621,14 +1618,12 @@ MapReduceResults results = mongoOperations.mapReduce(query, "jmr1", options().outputCollection("jmr1_out"), ValueObject.class); ---- -Note that you can specify additional limit and sort values as well on the query but not skip values. +Note that you can specify additional limit and sort values on the query, but you cannot specify skip values.
- -=== Example Usage +MongoDB allows executing JavaScript functions on the server by either directly sending the script or calling a stored one. `ScriptOperations` can be accessed through `MongoTemplate` and provides basic abstraction for `JavaScript` usage. The following example shows how to use the `ScriptOperations` class: ==== [source,java] ---- @@ -1648,7 +1643,7 @@ scriptOps.call("echo", "execute script via name"); <3> [[mongo.group]] == Group Operations - +As an alternative to using Map-Reduce to perform data aggregation, you can use the http://www.mongodb.org/display/DOCS/Aggregation#Aggregation-Group[`group` operation], which feels similar to using SQL's group-by query style, so it may feel more approachable than using Map-Reduce. Using the group operations does have some limitations: for example, it is not supported in a sharded environment, and it returns the full result set in a single BSON object, so the result should be small (less than 10,000 keys). Spring provides integration with MongoDB's group operation by providing methods on `MongoOperations` to simplify the creation and execution of group operations. It can convert the results of the group operation to a POJO and also integrates with Spring's http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#resources[Resource abstraction]. This lets you place your JavaScript files on the file system, classpath, HTTP server, or any other Spring Resource implementation and then reference the JavaScript resources through an easy URI style syntax -- for example, `classpath:reduce.js`. Externalizing JavaScript code in files is often preferable to embedding it as Java strings in your code. Note that you can still pass JavaScript code as Java strings if you prefer. @@ -1746,31 +1741,29 @@ GroupByResults results = mongoTemplate.group(where("x").gt(0), Spring Data MongoDB provides support for the Aggregation Framework introduced to MongoDB in version 2.2.
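Before looking at the API, it can help to picture what an aggregation pipeline does: documents flow through an ordered sequence of stages, each transforming or reducing the stream. The following plain-Java sketch (no MongoDB involved; the `tag` and `votes` field names and data are made up for illustration) mirrors a two-stage pipeline of a `$match` followed by a `$group` that counts documents per tag:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Plain-Java analogy (no MongoDB): documents pass through a $match-like
// filter stage and are then grouped and counted per "tag" value, as a
// $group stage with a count accumulator would do.
public class PipelineAnalogy {

    public static Map<Object, Long> countTagsWithVotesAbove(
            List<Map<String, Object>> documents, int minVotes) {
        return documents.stream()
                .filter(doc -> (int) doc.get("votes") > minVotes)      // $match stage
                .collect(Collectors.groupingBy(                        // $group stage
                        doc -> doc.get("tag"), Collectors.counting()));
    }

    public static void main(String[] args) {
        List<Map<String, Object>> documents = List.of(
                Map.of("tag", "spring", "votes", 5),
                Map.of("tag", "mongodb", "votes", 2),
                Map.of("tag", "spring", "votes", 1));
        System.out.println(countTagsWithVotesAbove(documents, 1));
    }
}
```

The Aggregation Framework abstractions described next express the same stage-by-stage model, but the stages run inside MongoDB rather than in application code.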
-The MongoDB Documentation describes the http://docs.mongodb.org/manual/core/aggregation/[Aggregation Framework] as follows: - -For further information see the full http://docs.mongodb.org/manual/aggregation/[reference documentation] of the aggregation framework and other data aggregation tools for MongoDB. +For further information, see the full http://docs.mongodb.org/manual/aggregation/[reference documentation] of the aggregation framework and other data aggregation tools for MongoDB. [[mongo.aggregation.basic-concepts]] === Basic Concepts -The Aggregation Framework support in Spring Data MongoDB is based on the following key abstractions `Aggregation`, `AggregationOperation` and `AggregationResults`. +The Aggregation Framework support in Spring Data MongoDB is based on the following key abstractions: `Aggregation`, `AggregationOperation`, and `AggregationResults`. * `Aggregation` + -An Aggregation represents a MongoDB `aggregate` operation and holds the description of the aggregation pipeline instructions. Aggregations are created by invoking the appropriate `newAggregation(…)` static factory Method of the `Aggregation` class which takes the list of `AggregateOperation` as a parameter next to the optional input class. +An `Aggregation` represents a MongoDB `aggregate` operation and holds the description of the aggregation pipeline instructions. Aggregations are created by invoking the appropriate `newAggregation(…)` static factory method of the `Aggregation` class, which takes a list of `AggregationOperation` and an optional input class. + -The actual aggregate operation is executed by the `aggregate` method of the `MongoTemplate` which also takes the desired output class as parameter. +The actual aggregate operation is executed by the `aggregate` method of the `MongoTemplate`, which takes the desired output class as a parameter.
+ * `AggregationOperation` + -An `AggregationOperation` represents a MongoDB aggregation pipeline operation and describes the processing that should be performed in this aggregation step. Although one could manually create an `AggregationOperation` the recommended way to construct an `AggregateOperation` is to use the static factory methods provided by the `Aggregate` class. +An `AggregationOperation` represents a MongoDB aggregation pipeline operation and describes the processing that should be performed in this aggregation step. Although you could manually create an `AggregationOperation`, we recommend using the static factory methods provided by the `Aggregation` class to construct an `AggregationOperation`. + * `AggregationResults` + -`AggregationResults` is the container for the result of an aggregate operation. It provides access to the raw aggregation result in the form of an `Document`, to the mapped objects and information which performed the aggregation. +`AggregationResults` is the container for the result of an aggregate operation. It provides access to the raw aggregation result (in the form of a `Document`), to the mapped objects, and to other information about the aggregation. ++ +The following listing shows the canonical example for using the Spring Data MongoDB support for the MongoDB Aggregation Framework: + -The canonical example for using the Spring Data MongoDB support for the MongoDB Aggregation Framework looks as follows: - [source,java] ---- import static org.springframework.data.mongodb.core.aggregation.Aggregation.*; @@ -1785,12 +1778,12 @@ AggregationResults results = mongoTemplate.aggregate(agg, "INPUT_COL List mappedResult = results.getMappedResults(); ---- -Note that if you provide an input class as the first parameter to the `newAggregation` method the `MongoTemplate` will derive the name of the input collection from this class. Otherwise if you don't not specify an input class you must provide the name of the input collection explicitly.
If an input-class and an input-collection is provided the latter takes precedence.
+Note that, if you provide an input class as the first parameter to the `newAggregation` method, the `MongoTemplate` derives the name of the input collection from this class. Otherwise, if you do not specify an input class, you must provide the name of the input collection explicitly. If both an input class and an input collection are provided, the latter takes precedence.

[[mongo.aggregation.supported-aggregation-operations]]
=== Supported Aggregation Operations

-The MongoDB Aggregation Framework provides the following types of Aggregation Operations:
+The MongoDB Aggregation Framework provides the following types of aggregation operations:

* Pipeline Aggregation Operators
* Group Aggregation Operators
@@ -1803,71 +1796,71 @@ The MongoDB Aggregation Framework provides the following types of Aggregation Op
* Conditional Aggregation Operators
* Lookup Aggregation Operators

-At the time of this writing we provide support for the following Aggregation Operations in Spring Data MongoDB. 
+At the time of this writing, we provide support for the following aggregation operations in Spring Data MongoDB:

.Aggregation Operations currently supported by Spring Data MongoDB
[cols="2*"]
|===
| Pipeline Aggregation Operators
-| bucket, bucketAuto, count, facet, geoNear, graphLookup, group, limit, lookup, match, project, replaceRoot, skip, sort, unwind
+| `bucket`, `bucketAuto`, `count`, `facet`, `geoNear`, `graphLookup`, `group`, `limit`, `lookup`, `match`, `project`, `replaceRoot`, `skip`, `sort`, `unwind`

| Set Aggregation Operators
-| setEquals, setIntersection, setUnion, setDifference, setIsSubset, anyElementTrue, allElementsTrue
+| `setEquals`, `setIntersection`, `setUnion`, `setDifference`, `setIsSubset`, `anyElementTrue`, `allElementsTrue`

| Group Aggregation Operators
-| addToSet, first, last, max, min, avg, push, sum, (*count), stdDevPop, stdDevSamp
+| `addToSet`, `first`, `last`, `max`, `min`, `avg`, `push`, `sum`, (*`count`), `stdDevPop`, `stdDevSamp`

| Arithmetic Aggregation Operators
-| abs, add (*via plus), ceil, divide, exp, floor, ln, log, log10, mod, multiply, pow, sqrt, subtract (*via minus), trunc
+| `abs`, `add` (*via `plus`), `ceil`, `divide`, `exp`, `floor`, `ln`, `log`, `log10`, `mod`, `multiply`, `pow`, `sqrt`, `subtract` (*via `minus`), `trunc`

| String Aggregation Operators
-| concat, substr, toLower, toUpper, stcasecmp, indexOfBytes, indexOfCP, split, strLenBytes, strLenCP, substrCP,
+| `concat`, `substr`, `toLower`, `toUpper`, `strcasecmp`, `indexOfBytes`, `indexOfCP`, `split`, `strLenBytes`, `strLenCP`, `substrCP`

| Comparison Aggregation Operators
-| eq (*via: is), gt, gte, lt, lte, ne
+| `eq` (*via: `is`), `gt`, `gte`, `lt`, `lte`, `ne`

| Array Aggregation Operators
-| arrayElementAt, concatArrays, filter, in, indexOfArray, isArray, range, reverseArray, reduce, size, slice, zip
+| `arrayElementAt`, `concatArrays`, `filter`, `in`, `indexOfArray`, `isArray`, `range`, `reverseArray`, `reduce`, `size`, `slice`, `zip`

| Literal 
Operators -| literal +| `literal` | Date Aggregation Operators -| dayOfYear, dayOfMonth, dayOfWeek, year, month, week, hour, minute, second, millisecond, dateToString, dateFromString, dateFromParts, dateToParts, isoDayOfWeek, isoWeek, isoWeekYear +| `dayOfYear`, `dayOfMonth`, `dayOfWeek`, `year`, `month`, `week`, `hour`, `minute`, `second`, `millisecond`, `dateToString`, `dateFromString`, `dateFromParts`, `dateToParts`, `isoDayOfWeek`, `isoWeek`, `isoWeekYear` | Variable Operators -| map +| `map` | Conditional Aggregation Operators -| cond, ifNull, switch +| `cond`, `ifNull`, `switch` | Type Aggregation Operators -| type +| `type` |=== -Note that the aggregation operations not listed here are currently not supported by Spring Data MongoDB. Comparison aggregation operators are expressed as `Criteria` expressions. +* The operation is mapped or added by Spring Data MongoDB. -*) The operation is mapped or added by Spring Data MongoDB. +Note that the aggregation operations not listed here are currently not supported by Spring Data MongoDB. Comparison aggregation operators are expressed as `Criteria` expressions. [[mongo.aggregation.projection]] === Projection Expressions -Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined via the `project` method of the `Aggregation` class either by passing a list of ``String``'s or an aggregation framework `Fields` object. The projection can be extended with additional fields through a fluent API via the `and(String)` method and aliased via the `as(String)` method. -Note that one can also define fields with aliases via the static factory method `Fields.field` of the aggregation framework that can then be used to construct a new `Fields` instance. References to projected fields in later aggregation stages are only valid by using the field name of included fields or their alias of aliased or newly defined fields. 
Fields not included in the projection cannot be referenced in later aggregation stages.
+Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined through the `project` method of the `Aggregation` class, either by passing a list of `String` objects or an aggregation framework `Fields` object. The projection can be extended with additional fields through a fluent API by using the `and(String)` method and aliased by using the `as(String)` method.
+Note that you can also define fields with aliases by using the `Fields.field` static factory method of the aggregation framework, which you can then use to construct a new `Fields` instance. References to projected fields in later aggregation stages are valid only for the field names of included fields or their aliases (including newly defined fields and their aliases). Fields not included in the projection cannot be referenced in later aggregation stages. The following listings show examples of projection expressions:

.Projection expression examples
====
[source,java]
----
-// will generate {$project: {name: 1, netPrice: 1}}
+// generates {$project: {name: 1, netPrice: 1}}
project("name", "netPrice")

-// will generate {$project: {bar: $foo}}
-project().and("foo").as("bar")
+// generates {$project: {thing2: $thing1}}
+project().and("thing1").as("thing2")

-// will generate {$project: {a: 1, b: 1, bar: $foo}}
-project("a","b").and("foo").as("bar")
+// generates {$project: {a: 1, b: 1, thing2: $thing1}}
+project("a","b").and("thing1").as("thing2")
----
====

@@ -1875,89 +1868,92 @@ project("a","b").and("foo").as("bar")
====
[source,java]
----
-// will generate {$project: {name: 1, netPrice: 1}}, {$sort: {name: 1}}
+// generates {$project: {name: 1, netPrice: 1}}, {$sort: {name: 1}}
project("name", "netPrice"), sort(ASC, "name")

-// will generate {$project: {bar: $foo}}, {$sort: {bar: 1}}
-project().and("foo").as("bar"), sort(ASC, "bar")
+// 
generates {$project: {thing2: $thing1}}, {$sort: {thing2: 1}} +project().and("thing1").as("thing2"), sort(ASC, "thing2") -// this will not work -project().and("foo").as("bar"), sort(ASC, "foo") +// does not work +project().and("thing1").as("thing2"), sort(ASC, "thing1") ---- ==== More examples for project operations can be found in the `AggregationTests` class. Note that further details regarding the projection expressions can be found in the http://docs.mongodb.org/manual/reference/operator/aggregation/project/#pipe._S_project[corresponding section] of the MongoDB Aggregation Framework reference documentation. [[mongo.aggregation.facet]] -=== Faceted classification +=== Faceted Classification -MongoDB supports as of Version 3.4 faceted classification using the Aggregation Framework. A faceted classification uses semantic categories, either general or subject-specific, that are combined to create the full classification entry. Documents flowing through the aggregation pipeline are classificated into buckets. A multi-faceted classification enables various aggregations on the same set of input documents, without needing to retrieve the input documents multiple times. +As of Version 3.4, MongoDB supports faceted classification by using the Aggregation Framework. A faceted classification uses semantic categories (either general or subject-specific) that are combined to create the full classification entry. Documents flowing through the aggregation pipeline are classified into buckets. A multi-faceted classification enables various aggregations on the same set of input documents, without needing to retrieve the input documents multiple times. ==== Buckets -Bucket operations categorize incoming documents into groups, called buckets, based on a specified expression and bucket boundaries. Bucket operations require a grouping field or grouping expression. They can be defined via the `bucket()`/`bucketAuto()` methods of the `Aggregate` class. 
`BucketOperation` and `BucketAutoOperation` can expose accumulations based on aggregation expressions for input documents. The bucket operation can be extended with additional parameters through a fluent API via the `with…()` methods, the `andOutput(String)` method and aliased via the `as(String)` method. Each bucket is represented as a document in the output.
+Bucket operations categorize incoming documents into groups, called buckets, based on a specified expression and bucket boundaries. Bucket operations require a grouping field or a grouping expression. You can define them by using the `bucket()` and `bucketAuto()` methods of the `Aggregation` class. `BucketOperation` and `BucketAutoOperation` can expose accumulations based on aggregation expressions for input documents. You can extend the bucket operation with additional parameters through a fluent API by using the `with…()` methods and the `andOutput(String)` method. You can alias the operation by using the `as(String)` method. Each bucket is represented as a document in the output.

-`BucketOperation` takes a defined set of boundaries to group incoming documents into these categories. Boundaries are required to be sorted.
+`BucketOperation` takes a defined set of boundaries to group incoming documents into these categories. Boundaries are required to be sorted. 
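To make the boundary semantics concrete, the following plain-Java sketch (illustrative only; it is neither Spring Data nor MongoDB driver code, and the helper name is hypothetical) mimics how `$bucket` assigns a value to the bucket whose lower boundary is the greatest boundary less than or equal to the value, with out-of-range values falling into the default bucket:

```java
// Hypothetical helper that mirrors $bucket's boundary rules: lower bounds are
// inclusive, upper bounds are exclusive, and unmatched values go to the default.
public class BucketSketch {

    public static Object bucketFor(int value, int[] sortedBoundaries, Object defaultBucket) {
        for (int i = 0; i < sortedBoundaries.length - 1; i++) {
            if (value >= sortedBoundaries[i] && value < sortedBoundaries[i + 1]) {
                // a bucket is identified by its lower boundary
                return sortedBoundaries[i];
            }
        }
        return defaultBucket;
    }

    public static void main(String[] args) {
        int[] boundaries = {0, 100, 400};
        System.out.println(bucketFor(50, boundaries, "Other"));  // prints 0
        System.out.println(bucketFor(100, boundaries, "Other")); // prints 100
        System.out.println(bucketFor(400, boundaries, "Other")); // prints Other (the last boundary is exclusive)
    }
}
```

This matching against consecutive boundary pairs is also why the boundaries passed to `withBoundaries(…)` must be sorted.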
The following listing shows some examples of bucket operations: .Bucket operation examples ==== [source,java] ---- -// will generate {$bucket: {groupBy: $price, boundaries: [0, 100, 400]}} +// generates {$bucket: {groupBy: $price, boundaries: [0, 100, 400]}} bucket("price").withBoundaries(0, 100, 400); -// will generate {$bucket: {groupBy: $price, default: "Other" boundaries: [0, 100]}} +// generates {$bucket: {groupBy: $price, default: "Other" boundaries: [0, 100]}} bucket("price").withBoundaries(0, 100).withDefault("Other"); -// will generate {$bucket: {groupBy: $price, boundaries: [0, 100], output: { count: { $sum: 1}}}} +// generates {$bucket: {groupBy: $price, boundaries: [0, 100], output: { count: { $sum: 1}}}} bucket("price").withBoundaries(0, 100).andOutputCount().as("count"); -// will generate {$bucket: {groupBy: $price, boundaries: [0, 100], 5, output: { titles: { $push: "$title"}}} +// generates {$bucket: {groupBy: $price, boundaries: [0, 100], 5, output: { titles: { $push: "$title"}}} bucket("price").withBoundaries(0, 100).andOutput("title").push().as("titles"); ---- ==== -`BucketAutoOperation` determines boundaries itself in an attempt to evenly distribute documents into a specified number of buckets. `BucketAutoOperation` optionally takes a granularity specifies the https://en.wikipedia.org/wiki/Preferred_number[preferred number] series to use to ensure that the calculated boundary edges end on preferred round numbers or their powers of 10. +`BucketAutoOperation` determines boundaries in an attempt to evenly distribute documents into a specified number of buckets. `BucketAutoOperation` optionally takes a granularity value that specifies the https://en.wikipedia.org/wiki/Preferred_number[preferred number] series to use to ensure that the calculated boundary edges end on preferred round numbers or on powers of 10. 
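The even-distribution goal of `$bucketAuto` can be sketched in plain Java. This is an illustrative approximation under the assumption of pre-sorted values, not MongoDB's actual algorithm (which additionally snaps boundaries to the chosen granularity series):

```java
import java.util.Arrays;

// Illustrative approximation of $bucketAuto: cut the sorted values at evenly
// spaced positions so that each bucket receives roughly count/buckets documents.
public class BucketAutoSketch {

    public static int[] autoBoundaries(int[] sortedValues, int buckets) {
        int[] bounds = new int[buckets + 1];
        for (int i = 0; i < buckets; i++) {
            // lower boundary of bucket i
            bounds[i] = sortedValues[i * sortedValues.length / buckets];
        }
        // the final boundary is the maximum value
        bounds[buckets] = sortedValues[sortedValues.length - 1];
        return bounds;
    }

    public static void main(String[] args) {
        int[] prices = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
        System.out.println(Arrays.toString(autoBoundaries(prices, 5))); // prints [1, 3, 5, 7, 9, 10]
    }
}
```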
The following listing shows examples of bucket operations: .Bucket operation examples ==== [source,java] ---- -// will generate {$bucketAuto: {groupBy: $price, buckets: 5}} +// generates {$bucketAuto: {groupBy: $price, buckets: 5}} bucketAuto("price", 5) -// will generate {$bucketAuto: {groupBy: $price, buckets: 5, granularity: "E24"}} +// generates {$bucketAuto: {groupBy: $price, buckets: 5, granularity: "E24"}} bucketAuto("price", 5).withGranularity(Granularities.E24).withDefault("Other"); -// will generate {$bucketAuto: {groupBy: $price, buckets: 5, output: { titles: { $push: "$title"}}} +// generates {$bucketAuto: {groupBy: $price, buckets: 5, output: { titles: { $push: "$title"}}} bucketAuto("price", 5).andOutput("title").push().as("titles"); ---- ==== -Bucket operations can use `AggregationExpression` via `andOutput()` and <> via `andOutputExpression()` to create output fields in buckets. +To create output fields in buckets, bucket operations can use `AggregationExpression` through `andOutput()` and <> through `andOutputExpression()`. Note that further details regarding bucket expressions can be found in the http://docs.mongodb.org/manual/reference/operator/aggregation/bucket/[`$bucket` section] and http://docs.mongodb.org/manual/reference/operator/aggregation/bucketAuto/[`$bucketAuto` section] of the MongoDB Aggregation Framework reference documentation. -==== Multi-faceted aggregation +==== Multi-faceted Aggregation -Multiple aggregation pipelines can be used to create multi-faceted aggregations which characterize data across multiple dimensions, or facets, within a single aggregation stage. Multi-faceted aggregations provide multiple filters and categorizations to guide data browsing and analysis. A common implementation of faceting is how many online retailers provide ways to narrow down search results by applying filters on product price, manufacturer, size, etc. 
+Multiple aggregation pipelines can be used to create multi-faceted aggregations that characterize data across multiple dimensions (or facets) within a single aggregation stage. Multi-faceted aggregations provide multiple filters and categorizations to guide data browsing and analysis. A common implementation of faceting is how many online retailers provide ways to narrow down search results by applying filters on product price, manufacturer, size, and other factors. -A `FacetOperation` can be defined via the `facet()` method of the `Aggregation` class. It can be customized with multiple aggregation pipelines via the `and()` method. Each sub-pipeline has its own field in the output document where its results are stored as an array of documents. +You can define a `FacetOperation` by using the `facet()` method of the `Aggregation` class. You can customize it with multiple aggregation pipelines by using the `and()` method. Each sub-pipeline has its own field in the output document where its results are stored as an array of documents. -Sub-pipelines can project and filter input documents prior grouping. Common cases are extraction of date parts or calculations before categorization. +Sub-pipelines can project and filter input documents prior to grouping. Common use cases include extraction of date parts or calculations before categorization. 
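The sub-pipeline idea can be sketched in plain Java (illustrative only; the facet names and the in-memory "pipelines" are hypothetical stand-ins for real aggregation stages): each facet independently processes the same input and contributes one field to the combined output document.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative sketch of $facet semantics: two independent "sub-pipelines"
// run over the same input list, and each stores its result under its own key.
public class FacetSketch {

    public static Map<String, Object> facets(List<Integer> prices) {
        Map<String, Object> out = new LinkedHashMap<>();
        // facet 1: filter then count (a match-style sub-pipeline)
        out.put("cheapCount", prices.stream().filter(p -> p < 100).count());
        // facet 2: classify into coarse price ranges (a bucket-style sub-pipeline)
        out.put("byRange", prices.stream()
                .collect(Collectors.groupingBy(p -> p < 100 ? "low" : "high", Collectors.counting())));
        return out;
    }

    public static void main(String[] args) {
        System.out.println(facets(Arrays.asList(10, 150, 90, 400)));
    }
}
```

As with `$facet`, each sub-pipeline sees the same input documents, so no second query against the source collection is needed.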
The following listing shows facet operation examples: .Facet operation examples ==== [source,java] ---- -// will generate {$facet: {categorizedByPrice: [ { $match: { price: {$exists : true}}}, { $bucketAuto: {groupBy: $price, buckets: 5}}]}} +// generates {$facet: {categorizedByPrice: [ { $match: { price: {$exists : true}}}, { $bucketAuto: {groupBy: $price, buckets: 5}}]}} facet(match(Criteria.where("price").exists(true)), bucketAuto("price", 5)).as("categorizedByPrice")) -// will generate {$facet: {categorizedByYear: [ -// { $project: { title: 1, publicationYear: { $year: "publicationDate"}}}, -// { $bucketAuto: {groupBy: $price, buckets: 5, output: { titles: {$push:"$title"}}} -// ]}} +// generates {$facet: {categorizedByCountry: [ { $match: { country: {$exists : true}}}, { $sortByCount: "$country"}]}} +facet(match(Criteria.where("country").exists(true)), sortByCount("country")).as("categorizedByCountry")) + +// generates {$facet: {categorizedByYear: [ +// { $project: { title: 1, publicationYear: { $year: "publicationDate"}}}, +// { $bucketAuto: {groupBy: $price, buckets: 5, output: { titles: {$push:"$title"}}} +// ]}} facet(project("title").and("publicationDate").extractYear().as("publicationYear"), bucketAuto("publicationYear", 5).andOutput("title").push().as("titles")) .as("categorizedByYear")) @@ -1969,18 +1965,18 @@ Note that further details regarding facet operation can be found in the http://d [[mongo.aggregation.projection.expressions]] ==== Spring Expression Support in Projection Expressions -We support the use of SpEL expression in projection expressions via the `andExpression` method of the `ProjectionOperation` and `BucketOperation` classes. This allows you to define the desired expression as a SpEL expression which is translated into a corresponding MongoDB projection expression part on query execution. This makes it much easier to express complex calculations. 
+We support the use of SpEL expressions in projection expressions through the `andExpression` method of the `ProjectionOperation` and `BucketOperation` classes. This feature lets you define the desired expression as a SpEL expression. On query execution, the SpEL expression is translated into a corresponding MongoDB projection expression part. This arrangement makes it much easier to express complex calculations.

-===== Complex calculations with SpEL expressions
+===== Complex Calculations with SpEL Expressions

-The following SpEL expression:
+Consider the following SpEL expression:

[source,java]
----
1 + (q + 1) / (q - 1)
----

-will be translated into the following projection expression part:
+The preceding expression is translated into the following projection expression part:

[source,javascript]
----
@@ -1992,11 +1988,13 @@ will be translated into the following projection expression part:
}]}
----

-Have a look at an example in more context in <> and <>. You can find more usage examples for supported SpEL expression constructs in `SpelExpressionTransformerUnitTests`.
+You can see examples in more context in <> and <>. You can find more usage examples for supported SpEL expression constructs in `SpelExpressionTransformerUnitTests`. The following table shows the SpEL transformations supported by Spring Data MongoDB:

.Supported SpEL transformations
-[cols="2"]
+[%header,cols="2"]
|===
+| SpEL Expression
+| Mongo Expression Part
| a == b
| { $eq : [$a, $b] }
| a != b
@@ -2029,7 +2027,7 @@ Have a look at an example in more context in < results = mongoTemplate.aggregate(agg, "tags", TagC
List tagCount = results.getMappedResults();
----

-* In order to do this we first create a new aggregation via the `newAggregation` static factory method to which we pass a list of aggregation operations. These aggregate operations define the aggregation pipeline of our `Aggregation`. 
-* As a second step we select the `"tags"` field (which is an array of strings) from the input collection with the `project` operation. -* In a third step we use the `unwind` operation to generate a new document for each tag within the `"tags"` array. -* In the forth step we use the `group` operation to define a group for each `"tags"`-value for which we aggregate the occurrence count via the `count` aggregation operator and collect the result in a new field called `"n"`. -* As a fifth step we select the field `"n"` and create an alias for the id-field generated from the previous group operation (hence the call to `previousOperation()`) with the name `"tag"`. -* As the sixth step we sort the resulting list of tags by their occurrence count in descending order via the `sort` operation. -* Finally we call the `aggregate` Method on the MongoTemplate in order to let MongoDB perform the actual aggregation operation with the created `Aggregation` as an argument. +The preceding listing uses the following algorithm: -Note that the input collection is explicitly specified as the `"tags"` parameter to the `aggregate` Method. If the name of the input collection is not specified explicitly, it is derived from the input-class passed as first parameter to the `newAggreation` Method. +. Create a new aggregation by using the `newAggregation` static factory method, to which we pass a list of aggregation operations. These aggregate operations define the aggregation pipeline of our `Aggregation`. +. Use the `project` operation to select the `tags` field (which is an array of strings) from the input collection. +. Use the `unwind` operation to generate a new document for each tag within the `tags` array. +. Use the `group` operation to define a group for each `tags` value for which we aggregate the occurrence count (by using the `count` aggregation operator and collecting the result in a new field called `n`). +. 
Select the `n` field and create an alias for the ID field generated from the previous group operation (hence the call to `previousOperation()`) with a name of `tag`.
+. Use the `sort` operation to sort the resulting list of tags by their occurrence count in descending order.
+. Call the `aggregate` method on `MongoTemplate` to let MongoDB perform the actual aggregation operation, with the created `Aggregation` as an argument.
+
+Note that the input collection is explicitly specified as the `tags` parameter to the `aggregate` method. If the name of the input collection is not specified explicitly, it is derived from the input class passed as the first parameter to the `newAggregation` method.

[[mongo.aggregation.examples.example2]]
-.Aggregation Framework Example 2
+===== Aggregation Framework Example 2

-This example is based on the http://docs.mongodb.org/manual/tutorial/aggregation-examples/#largest-and-smallest-cities-by-state[Largest and Smallest Cities by State] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return the smallest and largest cities by population for each state, using the aggregation framework. This example demonstrates the usage of grouping, sorting and projections (selection).
+This example is based on the http://docs.mongodb.org/manual/tutorial/aggregation-examples/#largest-and-smallest-cities-by-state[Largest and Smallest Cities by State] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return the smallest and largest cities by population for each state by using the aggregation framework. This example demonstrates grouping, sorting, and projections (selection). 
[source,java] ---- @@ -2135,19 +2135,22 @@ AggregationResults result = mongoTemplate.aggregate(aggregation, Z ZipInfoStats firstZipInfoStats = result.getMappedResults().get(0); ---- -* The class `ZipInfo` maps the structure of the given input-collection. The class `ZipInfoStats` defines the structure in the desired output format. -* As a first step we use the `group` operation to define a group from the input-collection. The grouping criteria is the combination of the fields `"state"` and `"city"` which forms the id structure of the group. We aggregate the value of the `"population"` property from the grouped elements with by using the `sum` operator saving the result in the field `"pop"`. -* In a second step we use the `sort` operation to sort the intermediate-result by the fields `"pop"`, `"state"` and `"city"` in ascending order, such that the smallest city is at the top and the biggest city is at the bottom of the result. Note that the sorting on `"state"` and `"city"` is implicitly performed against the group id fields which Spring Data MongoDB took care of. -* In the third step we use a `group` operation again to group the intermediate result by `"state"`. Note that `"state"` again implicitly references an group-id field. We select the name and the population count of the biggest and smallest city with calls to the `last(…)` and `first(...)` operator respectively via the `project` operation. -* As the forth step we select the `"state"` field from the previous `group` operation. Note that `"state"` again implicitly references an group-id field. As we do not want an implicitly generated id to appear, we exclude the id from the previous operation via `and(previousOperation()).exclude()`. As we want to populate the nested `City` structures in our output-class accordingly we have to emit appropriate sub-documents with the nested method. 
-* Finally as the fifth step we sort the resulting list of `StateStats` by their state name in ascending order via the `sort` operation.
+Note that the `ZipInfo` class maps the structure of the given input collection. The `ZipInfoStats` class defines the structure in the desired output format.

-Note that we derive the name of the input-collection from the `ZipInfo`-class passed as first parameter to the `newAggregation`-Method.
+The preceding listings use the following algorithm:
+
+. Use the `group` operation to define a group from the input collection. The grouping criterion is the combination of the `state` and `city` fields, which forms the ID structure of the group. We aggregate the value of the `population` property from the grouped elements by using the `sum` operator and save the result in the `pop` field.
+. Use the `sort` operation to sort the intermediate result by the `pop`, `state`, and `city` fields, in ascending order, such that the smallest city is at the top and the biggest city is at the bottom of the result. Note that the sorting on `state` and `city` is implicitly performed against the group ID fields (which Spring Data MongoDB handles).
+. Use a `group` operation again to group the intermediate result by `state`. Note that `state` again implicitly references a group ID field. We select the name and the population count of the biggest and smallest city with calls to the `last(…)` and `first(...)` operators, respectively, in the `project` operation.
+. Select the `state` field from the previous `group` operation. Note that `state` again implicitly references a group ID field. Because we do not want an implicitly generated ID to appear, we exclude the ID from the previous operation by using `and(previousOperation()).exclude()`. Because we want to populate the nested `City` structures in our output class, we have to emit appropriate sub-documents by using the `nested` method.
+. 
Sort the resulting list of `StateStats` by their state name in ascending order in the `sort` operation. + +Note that we derive the name of the input collection from the `ZipInfo` class passed as the first parameter to the `newAggregation` method. [[mongo.aggregation.examples.example3]] -.Aggregation Framework Example 3 +===== Aggregation Framework Example 3 -This example is based on the http://docs.mongodb.org/manual/tutorial/aggregation-examples/#states-with-populations-over-10-million[States with Populations Over 10 Million ]example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return all states with a population greater than 10 million, using the aggregation framework. This example demonstrates the usage of grouping, sorting and matching (filtering). +This example is based on the http://docs.mongodb.org/manual/tutorial/aggregation-examples/#states-with-populations-over-10-million[States with Populations Over 10 Million] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return all states with a population greater than 10 million, using the aggregation framework. This example demonstrates grouping, sorting, and matching (filtering). [source,java] ---- @@ -2172,14 +2175,16 @@ AggregationResults result = mongoTemplate.aggregate(agg, StateStats. List stateStatsList = result.getMappedResults(); ---- -* As a first step we group the input collection by the `"state"` field and calculate the sum of the `"population"` field and store the result in the new field `"totalPop"`. -* In the second step we sort the intermediate result by the id-reference of the previous group operation in addition to the `"totalPop"` field in ascending order. 
-* Finally in the third step we filter the intermediate result by using a `match` operation which accepts a `Criteria` query as an argument.
+The preceding listings use the following algorithm:

-Note that we derive the name of the input-collection from the `ZipInfo`-class passed as first parameter to the `newAggregation`-Method.
+. Group the input collection by the `state` field and calculate the sum of the `population` field, storing the result in the new `totalPop` field.
+. Sort the intermediate result by the ID reference of the previous group operation, in addition to the `totalPop` field, in ascending order.
+. Filter the intermediate result by using a `match` operation that accepts a `Criteria` query as an argument.
+
+Note that we derive the name of the input collection from the `ZipInfo` class passed as the first parameter to the `newAggregation` method.

[[mongo.aggregation.examples.example4]]
-.Aggregation Framework Example 4
+===== Aggregation Framework Example 4

This example demonstrates the use of simple arithmetic operations in the projection operation.

@@ -2210,10 +2215,10 @@ AggregationResults result = mongoTemplate.aggregate(agg, Document.clas
List resultList = result.getMappedResults();
----

-Note that we derive the name of the input-collection from the `Product`-class passed as first parameter to the `newAggregation`-Method.
+Note that we derive the name of the input collection from the `Product` class passed as the first parameter to the `newAggregation` method.

[[mongo.aggregation.examples.example5]]
-.Aggregation Framework Example 5
+===== Aggregation Framework Example 5

This example demonstrates the use of simple arithmetic operations derived from SpEL Expressions in the projection operation. 
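The SpEL-to-MongoDB translation described earlier preserves arithmetic meaning. The following plain-Java sketch (illustrative only, with a hypothetical sample value for the `q` field) evaluates the SpEL expression `1 + (q + 1) / (q - 1)` and its translated `$add`/`$divide`/`$subtract` form step by step to show that they agree:

```java
// Illustrative check that the SpEL expression and its translated MongoDB
// projection expression compute the same value for a sample field value.
public class SpelTranslationSketch {

    // SpEL: 1 + (q + 1) / (q - 1)
    public static double spel(double q) {
        return 1 + (q + 1) / (q - 1);
    }

    // Translated form: { $add: [1, { $divide: [ { $add: ["$q", 1] }, { $subtract: ["$q", 1] } ] }] }
    public static double translated(double q) {
        double add = q + 1;              // inner $add
        double subtract = q - 1;         // $subtract
        double divide = add / subtract;  // $divide
        return 1 + divide;               // outer $add
    }

    public static void main(String[] args) {
        System.out.println(spel(3.0));       // prints 3.0
        System.out.println(translated(3.0)); // prints 3.0
    }
}
```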
@@ -2247,11 +2252,11 @@ List resultList = result.getMappedResults(); ---- [[mongo.aggregation.examples.example6]] -.Aggregation Framework Example 6 +===== Aggregation Framework Example 6 This example demonstrates the use of complex arithmetic operations derived from SpEL Expressions in the projection operation. -Note: The additional parameters passed to the `addExpression` Method can be referenced via indexer expressions according to their position. In this example we reference the parameter which is the first parameter of the parameters array via `[0]`. External parameter expressions are replaced with their respective values when the SpEL expression is transformed into a MongoDB aggregation framework expression. +Note: The additional parameters passed to the `addExpression` method can be referenced with indexer expressions according to their position. In this example, we reference the first parameter of the parameters array with `[0]`. When the SpEL expression is transformed into a MongoDB aggregation framework expression, external parameter expressions are replaced with their respective values. [source,java] ---- @@ -2281,9 +2286,9 @@ List resultList = result.getMappedResults(); Note that we can also refer to other fields of the document within the SpEL expression. [[mongo.aggregation.examples.example7]] -.Aggregation Framework Example 7 +===== Aggregation Framework Example 7 -This example uses conditional projection. It's derived from the https://docs.mongodb.com/manual/reference/operator/aggregation/cond/[$cond reference documentation]. +This example uses conditional projection. It is derived from the https://docs.mongodb.com/manual/reference/operator/aggregation/cond/[$cond reference documentation]. [source,java] ---- @@ -2321,22 +2326,22 @@ AggregationResults result = mongoTemplate.aggregate(agg List stateStatsList = result.getMappedResults(); ---- -* This one-step aggregation uses a projection operation with the `inventory` collection. 
We project the `discount` field using a conditional operation for all inventory items that have a `qty` greater or equal to `250`. A second conditional projection is performed for the `description` field. We apply the description `Unspecified` to all items that either do not have a `description` field of items that have a `null` description. +This one-step aggregation uses a projection operation with the `inventory` collection. We project the `discount` field by using a conditional operation for all inventory items that have a `qty` greater than or equal to `250`. A second conditional projection is performed for the `description` field. We apply the `Unspecified` description to all items that either do not have a `description` field or have a `null` description. [[mongo.custom-converters]] -== Overriding default mapping with custom converters +== Overriding Default Mapping with Custom Converters -In order to have more fine-grained control over the mapping process you can register Spring converters with the `MongoConverter` implementations such as the `MappingMongoConverter`. +To have more fine-grained control over the mapping process, you can register Spring converters with the `MongoConverter` implementations, such as the `MappingMongoConverter`. -The `MappingMongoConverter` checks to see if there are any Spring converters that can handle a specific class before attempting to map the object itself. To 'hijack' the normal mapping strategies of the `MappingMongoConverter`, perhaps for increased performance or other custom mapping needs, you first need to create an implementation of the Spring `Converter` interface and then register it with the MappingConverter. +The `MappingMongoConverter` checks to see if any Spring converters can handle a specific class before attempting to map the object itself.
To 'hijack' the normal mapping strategies of the `MappingMongoConverter`, perhaps for increased performance or other custom mapping needs, you first need to create an implementation of the Spring `Converter` interface and then register it with the `MappingMongoConverter`. -NOTE: For more information on the Spring type conversion service see the reference docs http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#validation[here]. +NOTE: For more information on the Spring type conversion service, see the reference docs http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/core.html#validation[here]. [[mongo.custom-converters.writer]] -=== Saving using a registered Spring Converter +=== Saving by Using a Registered Spring Converter -An example implementation of the `Converter` that converts from a Person object to a `org.bson.Document` is shown below +The following example shows an implementation of the `Converter` that converts from a `Person` object to an `org.bson.Document`: [source,java] ---- @@ -2357,9 +2362,9 @@ public class PersonWriteConverter implements Converter { ---- [[mongo.custom-converters.reader]] -=== Reading using a Spring Converter +=== Reading by Using a Spring Converter -An example implementation of a Converter that converts from a Document to a Person object is shown below. +The following example shows an implementation of a `Converter` that converts from a `Document` to a `Person` object: [source,java] ---- @@ -2374,9 +2379,9 @@ public class PersonReadConverter implements Converter { ---- [[mongo.custom-converters.xml]] -=== Registering Spring Converters with the MongoConverter +=== Registering Spring Converters with the `MongoConverter` -The Mongo Spring namespace provides a convenience way to register Spring `Converter` s with the `MappingMongoConverter`.
The configuration snippet below shows how to manually register converter beans as well as configuring the wrapping `MappingMongoConverter` into a `MongoTemplate`. +The Mongo Spring namespace provides a convenient way to register Spring `Converter` instances with the `MappingMongoConverter`. The following configuration snippet shows how to manually register converter beans as well as configure the wrapping `MappingMongoConverter` into a `MongoTemplate`: [source,xml] ---- @@ -2399,7 +2404,7 @@ The Mongo Spring namespace provides a convenience way to register Spring `Conver ---- -You can also use the base-package attribute of the custom-converters element to enable classpath scanning for all `Converter` and `GenericConverter` implementations below the given package. +You can also use the `base-package` attribute of the `custom-converters` element to enable classpath scanning for all `Converter` and `GenericConverter` implementations below the given package, as the following example shows: [source,xml] ---- @@ -2409,9 +2414,9 @@ You can also use the base-package attribute of the custom-converters element to ---- [[mongo.converter-disambiguation]] -=== Converter disambiguation +=== Converter Disambiguation -Generally we inspect the `Converter` implementations for the source and target types they convert from and to. Depending on whether one of those is a type MongoDB can handle natively we will register the converter instance as reading or writing one. Have a look at the following samples: +Generally, we inspect the `Converter` implementations for the source and target types they convert from and to. Depending on whether one of those is a type MongoDB can handle natively, we register the converter instance as a reading or a writing converter. 
The following examples show a writing converter and a reading converter (note that the difference is in the order of the type parameters on `Converter`): [source,java] ---- @@ -2422,14 +2427,14 @@ class MyConverter implements Converter { … } class MyConverter implements Converter { … } ---- -In case you write a `Converter` whose source and target type are native Mongo types there's no way for us to determine whether we should consider it as reading or writing converter. Registering the converter instance as both might lead to unwanted results then. E.g. a `Converter` is ambiguous although it probably does not make sense to try to convert all `String` instances into `Long` instances when writing. To be generally able to force the infrastructure to register a converter for one way only we provide `@ReadingConverter` as well as `@WritingConverter` to be used in the converter implementation. +If you write a `Converter` whose source and target types are both native Mongo types, we cannot determine whether we should consider it as a reading or a writing converter. Registering the converter instance as both might lead to unwanted results. For example, a `Converter<String, Long>` is ambiguous, although it probably does not make sense to try to convert all `String` instances into `Long` instances when writing. To let you force the infrastructure to register a converter for only one way, we provide `@ReadingConverter` and `@WritingConverter` annotations to be used in the converter implementation. [[mongo-template.index-and-collections]] -== Index and Collection management +== Index and Collection Management -`MongoTemplate` provides a few methods for managing indexes and collections. These are collected into a helper interface called `IndexOperations`. You access these operations by calling the method `indexOps` and pass in either the collection name or the `java.lang.Class` of your entity (the collection name will be derived from the .class either by name or via annotation metadata).
+`MongoTemplate` provides a few methods for managing indexes and collections. These methods are collected into a helper interface called `IndexOperations`. You can access these operations by calling the `indexOps` method and passing in either the collection name or the `java.lang.Class` of your entity (the collection name is derived from the `.class`, either by name or from annotation metadata). -The `IndexOperations` interface is shown below +The following listing shows the `IndexOperations` interface: [source,java] ---- @@ -2448,20 +2453,18 @@ public interface IndexOperations { ---- [[mongo-template.index-and-collections.index]] -=== Methods for creating an Index +=== Methods for Creating an Index -We can create an index on a collection to improve query performance. - -==== Creating an index using the MongoTemplate +You can create an index on a collection to improve query performance by using the `MongoTemplate` class, as the following example shows: [source,java] ---- mongoTemplate.indexOps(Person.class).ensureIndex(new Index().on("name",Order.ASCENDING)); ---- -* *ensureIndex* Ensure that an index for the provided IndexDefinition exists for the collection. +`ensureIndex` makes sure that an index for the provided `IndexDefinition` exists for the collection. -You can create standard, geospatial and text indexes using the classes `IndexDefinition`, `GeoSpatialIndex` and `TextIndexDefinition`. For example, given the Venue class defined in a previous section, you would declare a geospatial query as shown below. +You can create standard, geospatial, and text indexes by using the `IndexDefinition`, `GeoSpatialIndex`, and `TextIndexDefinition` classes. For example, given the `Venue` class defined in a previous section, you could declare a geospatial query, as the following example shows: [source,java] ---- @@ -2471,9 +2474,9 @@ mongoTemplate.indexOps(Venue.class).ensureIndex(new GeospatialIndex("location")) NOTE: `Index` and `GeospatialIndex` support configuration of <>.
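Beyond the single-field index above, compound and full-text indexes go through the same `indexOps(…)` entry point. The following sketch is illustrative only: it assumes a configured `mongoTemplate` and hypothetical `lastName`, `age`, and `description` properties on `Person`:

```java
// Compound index over two properties (hypothetical field names).
mongoTemplate.indexOps(Person.class)
        .ensureIndex(new Index().on("lastName", Order.ASCENDING).on("age", Order.DESCENDING));

// Full-text index over the description field, built with the TextIndexDefinition builder.
mongoTemplate.indexOps(Person.class)
        .ensureIndex(TextIndexDefinition.builder().onField("description").build());
```

As the name `ensureIndex` suggests, both calls create the index only if it does not already exist.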
[[mongo-template.index-and-collections.access]] -=== Accessing index information +=== Accessing Index Information -The IndexOperations interface has the method getIndexInfo that returns a list of IndexInfo objects. This contains all the indexes defined on the collection. Here is an example that defines an index on the Person class that has age property. +The `IndexOperations` interface has the `getIndexInfo` method that returns a list of `IndexInfo` objects. This list contains all the indexes defined on the collection. The following example defines an index on the `Person` class that has an `age` property: [source,java] ---- @@ -2487,11 +2490,11 @@ List indexInfoList = template.indexOps(Person.class).getIndexInfo(); ---- [[mongo-template.index-and-collections.collection]] -=== Methods for working with a Collection +=== Methods for Working with a Collection -It's time to look at some code examples showing how to use the `MongoTemplate`. First we look at creating our first collection. +The following example shows how to create a collection: -.Working with collections using the MongoTemplate +.Working with collections by using `MongoTemplate` ==== [source,java] ---- @@ -2504,32 +2507,32 @@ mongoTemplate.dropCollection("MyNewCollection"); ---- ==== -* *getCollectionNames* Returns a set of collection names. -* *collectionExists* Check to see if a collection with a given name exists. -* *createCollection* Create an uncapped collection -* *dropCollection* Drop the collection -* *getCollection* Get a collection by name, creating it if it doesn't exist. +* *getCollectionNames*: Returns a set of collection names. +* *collectionExists*: Checks to see if a collection with a given name exists. +* *createCollection*: Creates an uncapped collection. +* *dropCollection*: Drops the collection. +* *getCollection*: Gets a collection by name, creating it if it does not exist. -NOTE: Collection creation allows customization via `CollectionOptions` and supports <>.
+NOTE: Collection creation allows customization with `CollectionOptions` and supports <>. [[mongo-template.commands]] == Executing Commands -You can also get at the MongoDB driver's `MongoDatabase.runCommand( )` method using the `executeCommand(…)` methods on `MongoTemplate`. These will also perform exception translation into Spring's `DataAccessException` hierarchy. +You can invoke the MongoDB driver's `MongoDatabase.runCommand()` method by using the `executeCommand(…)` methods on `MongoTemplate`. These methods also perform exception translation into Spring's `DataAccessException` hierarchy. [[mongo-template.commands.execution]] === Methods for Running Commands -* `Document` *executeCommand* `(Document command)` Execute a MongoDB command. -* `Document` *executeCommand* `(Document command, ReadPreference readPreference)` Execute a MongoDB command using the given nullable MongoDB `ReadPreference`. -* `Document` *executeCommand* `(String jsonCommand)` Execute the a MongoDB command expressed as a JSON string. +* `Document` *executeCommand* `(Document command)`: Runs a MongoDB command. +* `Document` *executeCommand* `(Document command, ReadPreference readPreference)`: Runs a MongoDB command with the given nullable MongoDB `ReadPreference`. +* `Document` *executeCommand* `(String jsonCommand)`: Runs a MongoDB command expressed as a JSON string. [[mongodb.mapping-usage.events]] == Lifecycle Events -Built into the MongoDB mapping framework are several `org.springframework.context.ApplicationEvent` events that your application can respond to by registering special beans in the `ApplicationContext`. By being based off Spring's ApplicationContext event infrastructure this enables other products, such as Spring Integration, to easily receive these events as they are a well known eventing mechanism in Spring based applications.
+The MongoDB mapping framework includes several `org.springframework.context.ApplicationEvent` events that your application can respond to by registering special beans in the `ApplicationContext`. Being based on Spring's `ApplicationContext` event infrastructure enables other products, such as Spring Integration, to easily receive these events, as they are a well-known eventing mechanism in Spring-based applications. -To intercept an object before it goes through the conversion process (which turns your domain object into a `org.bson.Document`), you'd register a subclass of `AbstractMongoEventListener` that overrides the `onBeforeConvert` method. When the event is dispatched, your listener will be called and passed the domain object before it goes into the converter. +To intercept an object before it goes through the conversion process (which turns your domain object into an `org.bson.Document`), you can register a subclass of `AbstractMongoEventListener` that overrides the `onBeforeConvert` method. When the event is dispatched, your listener is called and passed the domain object before it goes into the converter. The following example shows how to do so: ==== [source,java] ---- @@ -2543,7 +2546,7 @@ public class BeforeConvertListener extends AbstractMongoEventListener { ---- ==== -To intercept an object before it goes into the database, you'd register a subclass of `org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener` that overrides the `onBeforeSave` method. When the event is dispatched, your listener will be called and passed the domain object and the converted `com.mongodb.Document`. +To intercept an object before it goes into the database, you can register a subclass of `org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener` that overrides the `onBeforeSave` method. When the event is dispatched, your listener is called and passed the domain object and the converted `com.mongodb.Document`.
The following example shows how to do so: ==== [source,java] ---- @@ -2557,45 +2560,45 @@ public class BeforeSaveListener extends AbstractMongoEventListener { ---- ==== -Simply declaring these beans in your Spring ApplicationContext will cause them to be invoked whenever the event is dispatched. +Declaring these beans in your Spring `ApplicationContext` causes them to be invoked whenever the event is dispatched. -The list of callback methods that are present in AbstractMappingEventListener are +The following callback methods are present in `AbstractMappingEventListener`: -* `onBeforeConvert` - called in MongoTemplate insert, insertList and save operations before the object is converted to a Document using a MongoConveter. -* `onBeforeSave` - called in MongoTemplate insert, insertList and save operations *before* inserting/saving the Document in the database. -* `onAfterSave` - called in MongoTemplate insert, insertList and save operations *after* inserting/saving the Document in the database. -* `onAfterLoad` - called in MongoTemplate find, findAndRemove, findOne and getCollection methods after the Document is retrieved from the database. -* `onAfterConvert` - called in MongoTemplate find, findAndRemove, findOne and getCollection methods after the Document retrieved from the database was converted to a POJO. +* `onBeforeConvert`: Called in `MongoTemplate` `insert`, `insertList`, and `save` operations before the object is converted to a `Document` by a `MongoConverter`. +* `onBeforeSave`: Called in `MongoTemplate` `insert`, `insertList`, and `save` operations *before* inserting or saving the `Document` in the database. +* `onAfterSave`: Called in `MongoTemplate` `insert`, `insertList`, and `save` operations *after* inserting or saving the `Document` in the database. +* `onAfterLoad`: Called in `MongoTemplate` `find`, `findAndRemove`, `findOne`, and `getCollection` methods after the `Document` has been retrieved from the database.
+* `onAfterConvert`: Called in `MongoTemplate` `find`, `findAndRemove`, `findOne`, and `getCollection` methods after the `Document` retrieved from the database has been converted to a POJO. -NOTE: Lifecycle events are only emitted for root level types. Complex types used as properties within a document root are not subject of event publication unless they are document references annotated with `@DBRef`. +NOTE: Lifecycle events are only emitted for root-level types. Complex types used as properties within a document root are not subject to event publication unless they are document references annotated with `@DBRef`. [[mongo.exception]] == Exception Translation The Spring framework provides exception translation for a wide variety of database and mapping technologies. This has traditionally been for JDBC and JPA. The Spring support for MongoDB extends this feature to the MongoDB Database by providing an implementation of the `org.springframework.dao.support.PersistenceExceptionTranslator` interface. -The motivation behind mapping to Spring's http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/data-access.html#dao-exceptions[consistent data access exception hierarchy] is that you are then able to write portable and descriptive exception handling code without resorting to coding against MongoDB error codes. All of Spring's data access exceptions are inherited from the root `DataAccessException` class so you can be sure that you will be able to catch all database related exception within a single try-catch block. Note, that not all exceptions thrown by the MongoDB driver inherit from the MongoException class. The inner exception and message are preserved so no information is lost.
+The motivation behind mapping to Spring's http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/data-access.html#dao-exceptions[consistent data access exception hierarchy] is that you are then able to write portable and descriptive exception handling code without resorting to coding against MongoDB error codes. All of Spring's data access exceptions are inherited from the root `DataAccessException` class so that you can be sure to catch all database-related exceptions within a single try-catch block. Note that not all exceptions thrown by the MongoDB driver inherit from the `MongoException` class. The inner exception and message are preserved so that no information is lost. -Some of the mappings performed by the `MongoExceptionTranslator` are: com.mongodb.Network to DataAccessResourceFailureException and `MongoException` error codes 1003, 12001, 12010, 12011, 12012 to `InvalidDataAccessApiUsageException`. Look into the implementation for more details on the mapping. +Some of the mappings performed by the `MongoExceptionTranslator` are `com.mongodb.Network` to `DataAccessResourceFailureException` and `MongoException` error codes 1003, 12001, 12010, 12011, and 12012 to `InvalidDataAccessApiUsageException`. Look into the implementation for more details on the mapping. [[mongo.executioncallback]] -== Execution callbacks +== Execution Callbacks -One common design feature of all Spring template classes is that all functionality is routed into one of the templates execute callback methods. This helps ensure that exceptions and any resource management that maybe required are performed consistency. While this was of much greater need in the case of JDBC and JMS than with MongoDB, it still offers a single spot for exception translation and logging to occur.
As such, using these execute callback is the preferred way to access the MongoDB driver's `DB` and `DBCollection` objects to perform uncommon operations that were not exposed as methods on `MongoTemplate`. +One common design feature of all Spring template classes is that all functionality is routed into one of the template's execute callback methods. Doing so helps to ensure that exceptions and any resource management that may be required are performed consistently. While JDBC and JMS need this feature much more than MongoDB does, it still offers a single spot for exception translation and logging to occur. Consequently, using these execute callbacks is the preferred way to access the MongoDB driver's `DB` and `DBCollection` objects to perform uncommon operations that were not exposed as methods on `MongoTemplate`. -Here is a list of execute callback methods. +The following list describes the execute callback methods. -* ` T` *execute* `(Class entityClass, CollectionCallback action)` Executes the given CollectionCallback for the entity collection of the specified class. +* `<T> T` *execute* `(Class<?> entityClass, CollectionCallback<T> action)`: Executes the given `CollectionCallback` for the entity collection of the specified class. -* ` T` *execute* `(String collectionName, CollectionCallback action)` Executes the given CollectionCallback on the collection of the given name. +* `<T> T` *execute* `(String collectionName, CollectionCallback<T> action)`: Executes the given `CollectionCallback` on the collection of the given name. -* ` T` *execute* `(DbCallback action) Spring Data MongoDB provides support for the Aggregation Framework introduced to MongoDB in version 2.2.` Executes a DbCallback translating any exceptions as necessary. +* `<T> T` *execute* `(DbCallback<T> action)`: Executes a `DbCallback`, translating any exceptions as necessary.
-* ` T` *execute* `(String collectionName, DbCallback action)` Executes a DbCallback on the collection of the given name translating any exceptions as necessary. +* `<T> T` *execute* `(String collectionName, DbCallback<T> action)`: Executes a `DbCallback` on the collection of the given name, translating any exceptions as necessary. -* ` T` *executeInSession* `(DbCallback action)` Executes the given DbCallback within the same connection to the database so as to ensure consistency in a write heavy environment where you may read the data that you wrote. +* `<T> T` *executeInSession* `(DbCallback<T> action)`: Executes the given `DbCallback` within the same connection to the database so as to ensure consistency in a write-heavy environment where you may read the data that you wrote. -Here is an example that uses the `CollectionCallback` to return information about an index +The following example uses the `CollectionCallback` to return information about an index: [source,java] ---- @@ -2613,9 +2616,9 @@ boolean hasIndex = template.execute("geolocation", new CollectionCallback<Boolean> ---- [[gridfs]] -== GridFS support +== GridFS Support -MongoDB supports storing binary files inside it's filesystem GridFS. Spring Data MongoDB provides a `GridFsOperations` interface as well as the according implementation `GridFsTemplate` to easily interact with the filesystem. You can setup a `GridFsTemplate` instance by handing it a `MongoDbFactory` as well as a `MongoConverter`: +MongoDB supports storing binary files inside its filesystem, GridFS. Spring Data MongoDB provides a `GridFsOperations` interface as well as the corresponding implementation, `GridFsTemplate`, to let you interact with the filesystem.
You can set up a `GridFsTemplate` instance by handing it a `MongoDbFactory` as well as a `MongoConverter`, as the following example shows: .JavaConfig setup for a GridFsTemplate ==== [source,java] ---- @@ -2633,7 +2636,7 @@ class GridFsConfiguration extends AbstractMongoConfiguration { ---- ==== -An according XML configuration looks like this: +The corresponding XML configuration follows: .XML configuration for a GridFsTemplate ==== [source,xml] ---- @@ -2660,7 +2663,7 @@ An according XML configuration looks like this: ---- ==== -The template can now be injected and used to perform storage and retrieval operations. +The template can now be injected and used to perform storage and retrieval operations, as the following example shows: .Using GridFsTemplate to store files ==== [source,java] ---- @@ -2684,9 +2687,9 @@ class GridFsClient { ---- ==== -The `store(…)` operations take an `InputStream`, a filename and optionally metadata information about the file to store. The metadata can be an arbitrary object which will be marshaled by the `MongoConverter` configured with the `GridFsTemplate`. Alternatively you can also provide a `Document` as well. +The `store(…)` operations take an `InputStream`, a filename, and (optionally) metadata information about the file to store. The metadata can be an arbitrary object, which is marshaled by the `MongoConverter` configured with the `GridFsTemplate`. Alternatively, you can provide a `Document`. -Reading files from the filesystem can either be achieved through the `find(…)` or `getResources(…)` methods. Let's have a look at the `find(…)` methods first. You can either find a single file matching a `Query` or multiple ones. To easily define file queries we provide the `GridFsCriteria` helper class. It provides static factory methods to encapsulate default metadata fields (e.g. `whereFilename()`, `whereContentType()`) or the custom one through `whereMetaData()`. +You can read files from the filesystem through either the `find(…)` or the `getResources(…)` methods.
Let's have a look at the `find(…)` methods first. You can either find a single file or multiple files that match a `Query`. You can use the `GridFsCriteria` helper class to define queries. It provides static factory methods to encapsulate default metadata fields (such as `whereFilename()` and `whereContentType()`) or a custom one through `whereMetaData()`. The following example shows how to use `GridFsTemplate` to query for files: .Using GridFsTemplate to query for files ==== [source,java] ---- @@ -2705,9 +2708,9 @@ class GridFsClient { ---- ==== -NOTE: Currently MongoDB does not support defining sort criteria when retrieving files from GridFS. Thus any sort criteria defined on the `Query` instance handed into the `find(…)` method will be disregarded. +NOTE: Currently, MongoDB does not support defining sort criteria when retrieving files from GridFS. For this reason, any sort criteria defined on the `Query` instance handed into the `find(…)` method are disregarded. -The other option to read files from the GridFs is using the methods introduced by the `ResourcePatternResolver` interface. They allow handing an Ant path into the method ar thus retrieve files matching the given pattern. +The other option to read files from GridFS is to use the methods introduced by the `ResourcePatternResolver` interface. They allow handing an Ant path into the method and can thus retrieve files matching the given pattern. The following example shows how to use `GridFsTemplate` to read files: .Using GridFsTemplate to read files ==== [source,java] ---- @@ -2726,4 +2729,4 @@ class GridFsClient { ---- ==== -`GridFsOperations` extending `ResourcePatternResolver` allows the `GridFsTemplate` e.g. to be plugged into an `ApplicationContext` to read Spring Config files from a MongoDB. +`GridFsOperations` extends `ResourcePatternResolver`, which lets the `GridFsTemplate` (for example) be plugged into an `ApplicationContext` to read Spring Config files from a MongoDB database.
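As a sketch of that pattern-based access (the client class is hypothetical; it assumes an injected `GridFsOperations` and previously stored `.txt` files), resolving files by an Ant-style pattern might look like:

```java
class GridFsResourceClient {

    @Autowired
    GridFsOperations operations;

    // Resolves all stored files whose filenames match the Ant-style pattern.
    public GridFsResource[] readAllTextFiles() {
        return operations.getResources("*.txt");
    }
}
```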
diff --git a/src/main/asciidoc/reference/query-by-example.adoc b/src/main/asciidoc/reference/query-by-example.adoc index 283bd8592..59b8d2002 100644 --- a/src/main/asciidoc/reference/query-by-example.adoc +++ b/src/main/asciidoc/reference/query-by-example.adoc @@ -1,7 +1,9 @@ [[query-by-example.execution]] -== Executing an example +== Running an Example -.Query by Example using a Repository +The following example shows how to query by example when using a repository (of `Person` objects, in this case): + +.Query by Example using a repository ==== [source, java] ---- @@ -20,9 +22,9 @@ public class PersonService { ---- ==== -An `Example` containing an untyped `ExampleSpec` uses the Repository type and its collection name. Typed `ExampleSpec` use their type as result type and the collection name from the Repository. +An `Example` containing an untyped `ExampleSpec` uses the Repository type and its collection name. Typed `ExampleSpec` instances use their type as the result type and the collection name from the `Repository` instance. -NOTE: When including `null` values in the `ExampleSpec` Spring Data Mongo uses embedded document matching instead of dot notation property matching. This forces exact document matching for all property values and the property order in the embedded document. +NOTE: When including `null` values in the `ExampleSpec`, Spring Data Mongo uses embedded document matching instead of dot notation property matching. Doing so forces exact document matching for all property values and the property order in the embedded document. Spring Data MongoDB provides support for the following matching options: @@ -73,9 +75,9 @@ Spring Data MongoDB provides support for the following matching options: [[query-by-example.untyped]] == Untyped Example -By default `Example` is strictly typed. This means the mapped query will have a type match included restricting it to probe assignable types. Eg. 
when sticking with the default type key `_class` the query has restrictions like `_class : { $in : [ com.acme.Person] }`. +By default, `Example` is strictly typed. This means that the mapped query has an included type match, restricting it to probe assignable types. For example, when sticking with the default type key (`_class`), the query has restrictions such as `_class : { $in : [ com.acme.Person] }`. -By using the `UntypedExampleMatcher` it is possible bypasses the default behavior and skip the type restriction. So as long as field names match nearly any domain type can be used as the probe for creating the reference. +By using the `UntypedExampleMatcher`, it is possible to bypass the default behavior and skip the type restriction. So, as long as field names match, nearly any domain type can be used as the probe for creating the reference, as the following example shows: .Untyped Example Query ==== [source,java] ---- @@ -94,4 +96,4 @@ Example example = Example.of(probe, UntypedExampleMatcher.matching()); Query query = new Query(new Criteria().alike(example)); List result = template.find(query, Person.class); ---- -==== \ No newline at end of file +==== diff --git a/src/main/asciidoc/reference/reactive-mongo-repositories.adoc b/src/main/asciidoc/reference/reactive-mongo-repositories.adoc index 489191d23..8e3bf8f7c 100644 --- a/src/main/asciidoc/reference/reactive-mongo-repositories.adoc +++ b/src/main/asciidoc/reference/reactive-mongo-repositories.adoc @@ -1,19 +1,16 @@ [[mongo.reactive.repositories]] = Reactive MongoDB repositories -[[mongo.reactive.repositories.intro]] -== Introduction - -This chapter will point out the specialties for reactive repository support for MongoDB. This builds on the core repository support explained in <>. So make sure you've got a sound understanding of the basic concepts explained there. +This chapter describes the specifics of reactive repository support for MongoDB. This chapter builds on the core repository support explained in <>.
You should have a sound understanding of the basic concepts explained there. [[mongo.reactive.repositories.libraries]] == Reactive Composition Libraries The reactive space offers various reactive composition libraries. The most common libraries are https://github.com/ReactiveX/RxJava[RxJava] and https://projectreactor.io/[Project Reactor]. -Spring Data MongoDB is built on top of the https://mongodb.github.io/mongo-java-driver-reactivestreams/[MongoDB Reactive Streams] driver to provide maximal interoperability relying on the http://www.reactive-streams.org/[Reactive Streams] initiative. Static APIs such as `ReactiveMongoOperations` are provided by using Project Reactor's `Flux` and `Mono` types. Project Reactor offers various adapters to convert reactive wrapper types (`Flux` to `Observable` and vice versa) but conversion can easily clutter your code. +Spring Data MongoDB is built on top of the https://mongodb.github.io/mongo-java-driver-reactivestreams/[MongoDB Reactive Streams] driver, to provide maximal interoperability by relying on the http://www.reactive-streams.org/[Reactive Streams] initiative. Static APIs, such as `ReactiveMongoOperations`, are provided by using Project Reactor's `Flux` and `Mono` types. Project Reactor offers various adapters to convert reactive wrapper types (`Flux` to `Observable` and vice versa), but conversion can easily clutter your code. -Spring Data's Repository abstraction is a dynamic API, mostly defined by you and your requirements, as you're declaring query methods. Reactive MongoDB repositories can be either implemented using RxJava or Project Reactor wrapper types by simply extending from one of the library-specific repository interfaces: +Spring Data's Repository abstraction is a dynamic API, mostly defined by you and your requirements as you declare query methods. 
Reactive MongoDB repositories can be implemented by using either RxJava or Project Reactor wrapper types by extending from one of the following library-specific repository interfaces: * `ReactiveCrudRepository` * `ReactiveSortingRepository` @@ -25,9 +22,9 @@ Spring Data converts reactive wrapper types behind the scenes so that you can st [[mongo.reactive.repositories.usage]] == Usage -To access domain entities stored in a MongoDB you can leverage our sophisticated repository support that eases implementing those quite significantly. To do so, simply create an interface for your repository: +To access domain entities stored in a MongoDB database, you can use our sophisticated repository support, which significantly eases implementation. To do so, create an interface for your repository. Before you can do that, though, you need an entity, such as the entity defined in the following example: -.Sample Person entity +.Sample `Person` entity ==== [source,java] ---- @@ -44,7 +41,7 @@ public class Person { ---- ==== -We have a quite simple domain object here. Note that it has a property named `id` of type `ObjectId`. The default serialization mechanism used in `MongoTemplate` (which is backing the repository support) regards properties named id as document id. Currently we support `String`, `ObjectId` and `BigInteger` as id-types. +Note that the entity defined in the preceding example has a property named `id` of type `ObjectId`. The default serialization mechanism used in `MongoTemplate` (which backs the repository support) regards properties named `id` as the document ID. Currently, we support `String`, `ObjectId`, and `BigInteger` as id-types.
The following example shows how to create an interface that defines queries against the `Person` object from the preceding example: .Basic repository interface to persist Person entities ==== [source,java] ---- @@ -63,18 +60,20 @@ public interface ReactivePersonRepository extends ReactiveSortingRepository findFirstByLastname(String lastname); <5> } ---- -<1> The method shows a query for all people with the given lastname. The query will be derived parsing the method name for constraints which can be concatenated with `And` and `Or`. Thus the method name will result in a query expression of `{"lastname" : lastname}`. -<2> The method shows a query for all people with the given firstname once the firstname is emitted via the given `Publisher`. -<3> Use `Pageable` to pass on offset and sorting parameters to the database. -<4> Find a single entity for given criteria. Completes with `IncorrectResultSizeDataAccessException` on non unique results. -<5> Unless <4> the first entity is always emitted even if the query yields more result documents. +<1> The method shows a query for all people with the given `lastname`. The query is derived by parsing the method name for constraints that can be concatenated with `And` and `Or`. Thus, the method name results in a query expression of `{"lastname" : lastname}`. +<2> The method shows a query for all people with the given `firstname` once the `firstname` is emitted by the given `Publisher`. +<3> Use `Pageable` to pass offset and sorting parameters to the database. +<4> Find a single entity for the given criteria. It completes with `IncorrectResultSizeDataAccessException` on non-unique results. +<5> In contrast to <4>, the first entity is always emitted even if the query yields more result documents. ==== -For JavaConfig use the `@EnableReactiveMongoRepositories` annotation. The annotation carries the very same attributes like the namespace element. If no base package is configured the infrastructure will scan the package of the annotated configuration class.
+For Java configuration, use the `@EnableReactiveMongoRepositories` annotation. The annotation carries the same attributes as the namespace element. If no base package is configured, the infrastructure scans the package of the annotated configuration class. -NOTE: MongoDB uses two different drivers for blocking and reactive (non-blocking) data access. It's required to create a connection using the Reactive Streams driver to provide the required infrastructure for Spring Data's Reactive MongoDB support hence you're required to provide a separate Configuration for MongoDB's Reactive Streams driver. Please also note that your application will operate on two different connections if using Reactive and Blocking Spring Data MongoDB Templates and Repositories. +NOTE: MongoDB uses two different drivers for blocking and reactive (non-blocking) data access. You must create a connection by using the Reactive Streams driver to provide the required infrastructure for Spring Data's Reactive MongoDB support. Consequently, you must provide a separate configuration for MongoDB's Reactive Streams driver. Note that your application operates on two different connections if you use reactive and blocking Spring Data MongoDB templates and repositories. -.JavaConfig for repositories +The following listing shows how to use Java configuration for a repository: + +.Java configuration for repositories ==== [source,java] ---- @@ -100,7 +99,7 @@ class ApplicationConfig extends AbstractReactiveMongoConfiguration { ---- ==== -As our domain repository extends `ReactiveSortingRepository` it provides you with CRUD operations as well as methods for sorted access to the entities. Working with the repository instance is just a matter of dependency injecting it into a client. +Because our domain repository extends `ReactiveSortingRepository`, it provides you with CRUD operations as well as methods for sorted access to the entities. 
Working with the repository instance is a matter of dependency injecting it into a client, as the following example shows: .Sorted access to Person entities ==== [source,java] ---- @@ -123,7 +122,7 @@ public class PersonRepositoryTests { Spring Data's Reactive MongoDB support comes with a reduced feature set compared to the blocking <>. -Following features are supported: +It supports the following features: * Query Methods using <> * <> @@ -132,12 +131,14 @@ Following features are supported: * <> * <> -WARNING: Reactive Repositories do not support Type-safe Query methods using Querydsl. +WARNING: Reactive Repositories do not support type-safe query methods that use `Querydsl`. [[mongodb.reactive.repositories.queries.geo-spatial]] -=== Geo-spatial repository queries +=== Geo-spatial Repository Queries -As you've just seen there are a few keywords triggering geo-spatial operations within a MongoDB query. The `Near` keyword allows some further modification. Let's have look at some examples: +As you saw earlier in "`<>`", a few keywords trigger geo-spatial operations within a MongoDB query. The `Near` keyword allows some further modification, as the next few examples show. + +The following example shows how to define a `near` query that finds all persons within a given distance of a given point: .Advanced `Near` queries ==== [source,java] ---- @@ -151,9 +152,7 @@ public interface PersonRepository extends ReactiveMongoRepository` results within a reactive wrapper type. `GeoPage` and `GeoResults` are not supported as they contradict the deferred result approach with pre-calculating the average distance. Howevery, you can still pass in a `Pageable` argument to page results yourself. +Adding a `Distance` parameter to the query method allows restricting results to those within the given distance.
If the `Distance` was set up containing a `Metric`, we transparently use `$nearSphere` instead of `$near`, as the following example shows: .Using `Distance` with `Metrics` ==== [source,java] ---- @@ -166,11 +165,17 @@ Distance distance = new Distance(200, Metrics.KILOMETERS); ---- ==== -As you can see using a `Distance` equipped with a `Metric` causes `$nearSphere` clause to be added instead of a plain `$near`. Beyond that the actual distance gets calculated according to the `Metrics` used. +NOTE: Reactive Geo-spatial repository queries support the domain type and `GeoResult` results within a reactive wrapper type. `GeoPage` and `GeoResults` are not supported as they contradict the deferred result approach with pre-calculating the average distance. However, you can still pass in a `Pageable` argument to page results yourself. + +Using a `Distance` with a `Metric` causes a `$nearSphere` (instead of a plain `$near`) clause to be added. Beyond that, the actual distance gets calculated according to the `Metrics` used. + +(Note that `Metric` does not refer to metric units of measure. It could be miles rather than kilometers. Rather, `metric` refers to the concept of a system of measurement, regardless of which system you use.) NOTE: Using `@GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE)` on the target property forces usage of `$nearSphere` operator. -==== Geo-near queries +==== Geo-near Queries + +Spring Data MongoDB supports geo-near queries, as the following example shows: [source,java] ---- @@ -197,9 +202,9 @@ public interface PersonRepository extends ReactiveMongoRepository New -> Spring Template Project -> Simple Spring Utility Project -> press Yes when prompted. Then enter a project and a package name such as org.spring.mongodb.example. +To create a Spring project in STS, go to File -> New -> Spring Template Project -> Simple Spring Utility Project and press Yes when prompted. Then enter a project and a package name, such as org.spring.mongodb.example.
-Then add the following to pom.xml dependencies section. +Then add the following to the pom.xml dependencies section. [source,xml] ---- @@ -52,9 +52,9 @@ Then add the following to pom.xml dependencies section. ---- -NOTE: MongoDB uses two different drivers for blocking and reactive (non-blocking) data access. While blocking operations are provided by default, you're have to opt-in for reactive usage. +NOTE: MongoDB uses two different drivers for blocking and reactive (non-blocking) data access. While blocking operations are provided by default, you have to opt in for reactive usage. -Create a simple `Person` class to persist: +To get started with a working example, create a simple `Person` class to persist, as follows: [source,java] ---- @@ -87,7 +87,7 @@ public class Person { } ---- -And a main application to run +Then create an application to run, as follows: [source,java] ---- @@ -113,7 +113,7 @@ public class ReactiveMongoApp { } ---- -This will produce the following output +Running the preceding class produces the following output: [source] ---- @@ -125,25 +125,25 @@ This will produce the following output 2016-09-20 14:56:57,573 DEBUG .data.mongodb.core.ReactiveMongoTemplate: 528 - Dropped collection [person] ---- -Even in this simple example, there are few things to take notice of +Even in this simple example, there are a few things to take notice of: -* You can instantiate the central helper class of Spring Mongo, <>, using the standard `com.mongodb.reactivestreams.client.MongoClient` object and the name of the database to use. +* You can instantiate the central helper class of Spring Mongo (<>) by using the standard `com.mongodb.reactivestreams.client.MongoClient` object and the name of the database to use. * The mapper works against standard POJO objects without the need for any additional metadata (though you can optionally provide that information. See <>.).
-* Conventions are used for handling the id field, converting it to be a ObjectId when stored in the database. -* Mapping conventions can use field access. Notice the Person class has only getters. -* If the constructor argument names match the field names of the stored document, they will be used to instantiate the object +* Conventions are used for handling the ID field, converting it to be an `ObjectId` when stored in the database. +* Mapping conventions can use field access. Notice that the `Person` class has only getters. +* If the constructor argument names match the field names of the stored document, they are used to instantiate the object. -There is an https://github.com/spring-projects/spring-data-examples[github repository with several examples] that you can download and play around with to get a feel for how the library works. +There is a https://github.com/spring-projects/spring-data-examples[GitHub repository with several examples] that you can download and play around with to get a feel for how the library works. [[mongo.reactive.driver]] == Connecting to MongoDB with Spring and the Reactive Streams Driver -One of the first tasks when using MongoDB and Spring is to create a `com.mongodb.reactivestreams.client.MongoClient` object using the IoC container. +One of the first tasks when using MongoDB and Spring is to create a `com.mongodb.reactivestreams.client.MongoClient` object by using the IoC container.
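For orientation, the container-based configurations shown in the following sections all produce the kind of client shown in the next sketch, which uses the reactive driver directly. (This is a sketch, not part of the reference configuration; it assumes a MongoDB instance listening on `localhost:27017` and the `mongodb-driver-reactivestreams` artifact on the classpath.)

.Creating a reactive `MongoClient` directly with the driver (sketch)
====
[source,java]
----
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;

// The client connects lazily; no I/O happens until a returned Publisher is subscribed to.
MongoClient client = MongoClients.create("mongodb://localhost:27017");

// ... hand the client to Spring Data or use it directly ...

client.close(); // release driver resources on shutdown
----
====

Registering the client as a bean, as the following sections show, lets the container manage that lifecycle for you.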
[[mongo.reactive.mongo-java-config]] -=== Registering a MongoClient instance using Java based metadata +=== Registering a MongoClient Instance Using Java-based Metadata -An example of using Java based bean metadata to register an instance of a `com.mongodb.reactivestreams.client.MongoClient` is shown below +The following example shows how to use Java-based bean metadata to register an instance of a `com.mongodb.reactivestreams.client.MongoClient`: .Registering a com.mongodb.MongoClient object using Java based bean metadata ==== [source,java] ---- @@ -162,11 +162,11 @@ public class AppConfig { ---- ==== -This approach allows you to use the standard `com.mongodb.reactivestreams.client.MongoClient` API that you may already be used to using. +This approach lets you use the standard `com.mongodb.reactivestreams.client.MongoClient` API (which you may already know). -An alternative is to register an instance of `com.mongodb.reactivestreams.client.MongoClient` instance with the container using Spring's `ReactiveMongoClientFactoryBean`. As compared to instantiating a `com.mongodb.reactivestreams.client.MongoClient` instance directly, the FactoryBean approach has the added advantage of also providing the container with an `ExceptionTranslator` implementation that translates MongoDB exceptions to exceptions in Spring's portable `DataAccessException` hierarchy for data access classes annotated with the `@Repository` annotation. This hierarchy and use of `@Repository` is described in http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/data-access.html[Spring's DAO support features]. +An alternative is to register a `com.mongodb.reactivestreams.client.MongoClient` instance with the container by using Spring's `ReactiveMongoClientFactoryBean`.
As compared to instantiating a `com.mongodb.reactivestreams.client.MongoClient` instance directly, the `FactoryBean` approach has the added advantage of also providing the container with an `ExceptionTranslator` implementation that translates MongoDB exceptions to exceptions in Spring's portable `DataAccessException` hierarchy for data access classes annotated with the `@Repository` annotation. This hierarchy and use of `@Repository` is described in http://docs.spring.io/spring/docs/{springVersion}/spring-framework-reference/data-access.html[Spring's DAO support features]. -An example of a Java based bean metadata that supports exception translation on `@Repository` annotated classes is shown below: +The following example shows Java-based bean metadata that supports exception translation on `@Repository` annotated classes: .Registering a com.mongodb.MongoClient object using Spring's MongoClientFactoryBean and enabling Spring's exception translation support ==== @@ -189,13 +189,13 @@ public class AppConfig { ---- ==== -To access the `com.mongodb.reactivestreams.client.MongoClient` object created by the `ReactiveMongoClientFactoryBean` in other `@Configuration` or your own classes, just obtain the `MongoClient` from the context. +To access the `com.mongodb.reactivestreams.client.MongoClient` object created by the `ReactiveMongoClientFactoryBean` in other `@Configuration` or your own classes, get the `MongoClient` from the context. [[mongo.reactive.mongo-db-factory]] -=== The ReactiveMongoDatabaseFactory interface +=== The ReactiveMongoDatabaseFactory Interface -While `com.mongodb.reactivestreams.client.MongoClient` is the entry point to the reactive MongoDB driver API, connecting to a specific MongoDB database instance requires additional information such as the database name. With that information you can obtain a `com.mongodb.reactivestreams.client.MongoDatabase` object and access all the functionality of a specific MongoDB database instance. 
Spring provides the `org.springframework.data.mongodb.core.ReactiveMongoDatabaseFactory` interface shown below to bootstrap connectivity to the database. +While `com.mongodb.reactivestreams.client.MongoClient` is the entry point to the reactive MongoDB driver API, connecting to a specific MongoDB database instance requires additional information, such as the database name. With that information, you can obtain a `com.mongodb.reactivestreams.client.MongoDatabase` object and access all the functionality of a specific MongoDB database instance. Spring provides the `org.springframework.data.mongodb.core.ReactiveMongoDatabaseFactory` interface to bootstrap connectivity to the database. The following listing shows the `ReactiveMongoDatabaseFactory` interface: [source,java] ---- @@ -227,9 +227,9 @@ public interface ReactiveMongoDatabaseFactory { } ---- -The class `org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory` provides implements the ReactiveMongoDatabaseFactory interface and is created with a standard `com.mongodb.reactivestreams.client.MongoClient` instance and the database name. +The `org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory` class implements the `ReactiveMongoDatabaseFactory` interface and is created with a standard `com.mongodb.reactivestreams.client.MongoClient` instance and the database name. -Instead of using the IoC container to create an instance of `ReactiveMongoTemplate`, you can just use them in standard Java code as shown below. +Instead of using the IoC container to create an instance of `ReactiveMongoTemplate`, you can use them in standard Java code, as follows: [source,java] ---- @@ -253,9 +253,9 @@ public class MongoApp { The use of `SimpleMongoDbFactory` is the only difference between the listing shown in the <>. 
[[mongo.reactive.mongo-db-factory-java]] -=== Registering a ReactiveMongoDatabaseFactory instance using Java based metadata +=== Registering a ReactiveMongoDatabaseFactory Instance by Using Java-based Metadata -To register a `ReactiveMongoDatabaseFactory` instance with the container, you write code much like what was highlighted in the previous code listing. A simple example is shown below +To register a `ReactiveMongoDatabaseFactory` instance with the container, you can write code much like what was highlighted in the previous code listing, as the following example shows: [source,java] ---- @@ -268,7 +268,7 @@ public class MongoConfiguration { } ---- -To define the username and password create MongoDB connection string and pass it into the factory method as shown below. This listing also shows using `ReactiveMongoDatabaseFactory` register an instance of `ReactiveMongoTemplate` with the container. +To define the username and password, create a MongoDB connection string and pass it into the factory method, as the following listing shows. The listing also shows how to use `ReactiveMongoDatabaseFactory` to register an instance of `ReactiveMongoTemplate` with the container: [source,java] ---- @@ -286,30 +286,28 @@ public class MongoConfiguration { ---- [[mongo.reactive.template]] -== Introduction to ReactiveMongoTemplate +== Introduction to `ReactiveMongoTemplate` -The class `ReactiveMongoTemplate`, located in the package `org.springframework.data.mongodb`, is the central class of the Spring's Reactive MongoDB support providing a rich feature set to interact with the database. The template offers convenience operations to create, update, delete and query for MongoDB documents and provides a mapping between your domain objects and MongoDB documents. +The `ReactiveMongoTemplate` class, located in the `org.springframework.data.mongodb` package, is the central class of Spring's Reactive MongoDB support and provides a rich feature set to interact with the database.
The template offers convenience operations to create, update, delete, and query for MongoDB documents and provides a mapping between your domain objects and MongoDB documents. NOTE: Once configured, `ReactiveMongoTemplate` is thread-safe and can be reused across multiple instances. -The mapping between MongoDB documents and domain classes is done by delegating to an implementation of the interface `MongoConverter`. Spring provides a default implementation with `MongoMappingConverter`, but you can also write your own converter. Please refer to the section on MongoConverters for more detailed information. +The mapping between MongoDB documents and domain classes is done by delegating to an implementation of the `MongoConverter` interface. Spring provides a default implementation with `MongoMappingConverter`, but you can also write your own converter. See the <> for more detailed information. -The `ReactiveMongoTemplate` class implements the interface `ReactiveMongoOperations`. In as much as possible, the methods on `ReactiveMongoOperations` are named after methods available on the MongoDB driver `Collection` object as as to make the API familiar to existing MongoDB developers who are used to the driver API. For example, you will find methods such as "find", "findAndModify", "findOne", "insert", "remove", "save", "update" and "updateMulti". The design goal was to make it as easy as possible to transition between the use of the base MongoDB driver and `ReactiveMongoOperations`. A major difference in between the two APIs is that `ReactiveMongoOperations` can be passed domain objects instead of `Document` and there are fluent APIs for `Query`, `Criteria`, and `Update` operations instead of populating a `Document` to specify the parameters for those operations. +The `ReactiveMongoTemplate` class implements the `ReactiveMongoOperations` interface. 
As much as possible, the methods on `ReactiveMongoOperations` mirror methods available on the MongoDB driver `Collection` object, to make the API familiar to existing MongoDB developers who are used to the driver API. For example, you can find methods such as `find`, `findAndModify`, `findOne`, `insert`, `remove`, `save`, `update`, and `updateMulti`. The design goal is to make it as easy as possible to transition between the use of the base MongoDB driver and `ReactiveMongoOperations`. A major difference between the two APIs is that `ReactiveMongoOperations` can be passed domain objects instead of `Document`, and there are fluent APIs for `Query`, `Criteria`, and `Update` operations instead of populating a `Document` to specify the parameters for those operations. -NOTE: The preferred way to reference the operations on `ReactiveMongoTemplate` instance is via its interface `ReactiveMongoOperations`. +NOTE: The preferred way to reference the operations on a `ReactiveMongoTemplate` instance is through its `ReactiveMongoOperations` interface. -The default converter implementation used by `ReactiveMongoTemplate` is `MappingMongoConverter`. While the `MappingMongoConverter` can make use of additional metadata to specify the mapping of objects to documents it is also capable of converting objects that contain no additional metadata by using some conventions for the mapping of IDs and collection names. These conventions as well as the use of mapping annotations is explained in the <>. +The default converter implementation used by `ReactiveMongoTemplate` is `MappingMongoConverter`. While the `MappingMongoConverter` can use additional metadata to specify the mapping of objects to documents, it can also convert objects that contain no additional metadata by using some conventions for the mapping of IDs and collection names. These conventions as well as the use of mapping annotations are explained in the <>.
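As a small illustration of those conventions, consider the following sketch (the `Account` class and its fields are made up for this example). Without any mapping annotations, the `id` property is mapped to the document's `_id` field, and instances are stored in a collection named after the decapitalized class name:

.Mapping conventions at work (sketch)
====
[source,java]
----
public class Account {
    private ObjectId id;   // mapped to the _id field by convention
    private String owner;  // mapped to a field named "owner"
    // instances are stored in the "account" collection by default
}
----
====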
-Another central feature of `ReactiveMongoTemplate` is exception translation of exceptions thrown in the MongoDB Java driver into Spring's portable Data Access Exception hierarchy. Refer to the section on <> for more information. +Another central feature of `ReactiveMongoTemplate` is exception translation of exceptions thrown in the MongoDB Java driver into Spring's portable Data Access Exception hierarchy. See the section on <> for more information. -While there are many convenience methods on `ReactiveMongoTemplate` to help you easily perform common tasks if you should need to access the MongoDB driver API directly to access functionality not explicitly exposed by the MongoTemplate you can use one of several Execute callback methods to access underlying driver APIs. The execute callbacks will give you a reference to either a `com.mongodb.reactivestreams.client.MongoCollection` or a `com.mongodb.reactivestreams.client.MongoDatabase` object. Please see the section <> for more information. - -Now let's look at a examples of how to work with the `ReactiveMongoTemplate` in the context of the Spring container. +There are many convenience methods on `ReactiveMongoTemplate` to help you easily perform common tasks. However, if you need to access the MongoDB driver API directly to access functionality not explicitly exposed by the MongoTemplate, you can use one of several `execute` callback methods to access underlying driver APIs. The `execute` callbacks give you a reference to either a `com.mongodb.reactivestreams.client.MongoCollection` or a `com.mongodb.reactivestreams.client.MongoDatabase` object. See <> for more information. [[mongo.reactive.template.instantiating]] === Instantiating ReactiveMongoTemplate -You can use Java to create and register an instance of `ReactiveMongoTemplate` as shown below. 
+You can use Java to create and register an instance of `ReactiveMongoTemplate`, as follows: .Registering a `com.mongodb.reactivestreams.client.MongoClient` object and enabling Spring's exception translation support ==== [source,java] ---- @@ -329,33 +327,37 @@ public class AppConfig { ---- ==== -There are several overloaded constructors of `ReactiveMongoTemplate`. These are +There are several overloaded constructors of `ReactiveMongoTemplate`, including: -* `ReactiveMongoTemplate(MongoClient mongo, String databaseName)` - takes the `com.mongodb.MongoClient` object and the default database name to operate against. -* `ReactiveMongoTemplate(ReactiveMongoDatabaseFactory mongoDatabaseFactory)` - takes a ReactiveMongoDatabaseFactory object that encapsulated the `com.mongodb.reactivestreams.client.MongoClient` object and database name. -* `ReactiveMongoTemplate(ReactiveMongoDatabaseFactory mongoDatabaseFactory, MongoConverter mongoConverter)` - adds a `MongoConverter` to use for mapping. +* `ReactiveMongoTemplate(MongoClient mongo, String databaseName)`: Takes the `com.mongodb.MongoClient` object and the default database name to operate against. +* `ReactiveMongoTemplate(ReactiveMongoDatabaseFactory mongoDatabaseFactory)`: Takes a `ReactiveMongoDatabaseFactory` object that encapsulates the `com.mongodb.reactivestreams.client.MongoClient` object and database name. +* `ReactiveMongoTemplate(ReactiveMongoDatabaseFactory mongoDatabaseFactory, MongoConverter mongoConverter)`: Adds a `MongoConverter` to use for mapping. -Other optional properties that you might like to set when creating a `ReactiveMongoTemplate` are the default `WriteResultCheckingPolicy`, `WriteConcern`, and `ReadPreference`. +When creating a `ReactiveMongoTemplate`, you might also want to set the following properties: -NOTE: The preferred way to reference the operations on `ReactiveMongoTemplate` instance is via its interface `ReactiveMongoOperations`.
+* `WriteResultCheckingPolicy` +* `WriteConcern` +* `ReadPreference` + +NOTE: The preferred way to reference the operations on a `ReactiveMongoTemplate` instance is through its `ReactiveMongoOperations` interface. [[mongo.reactive.template.writeresultchecking]] -=== WriteResultChecking Policy +=== `WriteResultChecking` Policy -When in development it is very handy to either log or throw an `Exception` if the `com.mongodb.WriteResult` returned from any MongoDB operation contains an error. It is quite common to forget to do this during development and then end up with an application that looks like it runs successfully but in fact the database was not modified according to your expectations. Set MongoTemplate's property to an enum with the following values, `LOG`, `EXCEPTION`, or `NONE` to either log the error, throw and exception or do nothing. The default is to use a `WriteResultChecking` value of `NONE`. +When in development, it is handy to either log or throw an `Exception` if the `com.mongodb.WriteResult` returned from any MongoDB operation contains an error. It is quite common to forget to do this during development and then end up with an application that looks like it runs successfully when, in fact, the database was not modified according to your expectations. Set the `MongoTemplate` `WriteResultChecking` property to one of the following enum values: `LOG`, `EXCEPTION`, or `NONE` (to log the error, throw an exception, or do nothing, respectively). The default is to use a `WriteResultChecking` value of `NONE`.
+If it has not yet been specified through the driver at a higher level (such as `MongoDatabase`), you can set the `com.mongodb.WriteConcern` property that the `ReactiveMongoTemplate` uses for write operations. If ReactiveMongoTemplate's `WriteConcern` property is not set, it defaults to the one set in the MongoDB driver's `MongoDatabase` or `MongoCollection` setting. [[mongo.reactive.template.writeconcernresolver]] -=== WriteConcernResolver +=== `WriteConcernResolver` -For more advanced cases where you want to set different `WriteConcern` values on a per-operation basis (for remove, update, insert and save operations), a strategy interface called `WriteConcernResolver` can be configured on `ReactiveMongoTemplate`. Since `ReactiveMongoTemplate` is used to persist POJOs, the `WriteConcernResolver` lets you create a policy that can map a specific POJO class to a `WriteConcern` value. The `WriteConcernResolver` interface is shown below. +For more advanced cases where you want to set different `WriteConcern` values on a per-operation basis (for remove, update, insert, and save operations), a strategy interface called `WriteConcernResolver` can be configured on `ReactiveMongoTemplate`. Since `ReactiveMongoTemplate` is used to persist POJOs, the `WriteConcernResolver` lets you create a policy that can map a specific POJO class to a `WriteConcern` value. The following listing shows the `WriteConcernResolver` interface: [source,java] ---- @@ -364,7 +366,7 @@ public interface WriteConcernResolver { } ---- -The passed in argument, `MongoAction`, is what you use to determine the `WriteConcern` value to be used or to use the value of the Template itself as a default. `MongoAction` contains the collection name being written to, the `java.lang.Class` of the POJO, the converted `DBObject`, as well as the operation as an enumeration (`MongoActionOperation`: REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE) and a few other pieces of contextual information. 
For example, +The `MongoAction` argument determines the `WriteConcern` value to be used and whether to use the value of the template itself as a default. `MongoAction` contains the collection name being written to, the `java.lang.Class` of the POJO, the converted `DBObject`, the operation as a value from the `MongoActionOperation` enumeration (one of `REMOVE`, `UPDATE`, `INSERT`, `INSERT_LIST`, or `SAVE`), and a few other pieces of contextual information. The following example shows how to create a `WriteConcernResolver`: [source] ---- @@ -385,9 +387,9 @@ private class MyAppWriteConcernResolver implements WriteConcernResolver { [[mongo.reactive.template.save-update-remove]] == Saving, Updating, and Removing Documents -`ReactiveMongoTemplate` provides a simple way for you to save, update, and delete your domain objects and map those objects to documents stored in MongoDB. +`ReactiveMongoTemplate` lets you save, update, and delete your domain objects and map those objects to documents stored in MongoDB. -Given a simple class such as Person +Consider the following `Person` class: [source,java] ---- @@ -420,7 +422,7 @@ public class Person { } ---- -You can save, update and delete the object as shown below. +The following listing shows how you can save, update, and delete the `Person` object: [source,java] ---- @@ -451,26 +453,26 @@ public class ReactiveMongoApp { } ---- -There was implicit conversion using the `MongoConverter` between a `String` and `ObjectId` as stored in the database and recognizing a convention of the property "Id" name. +The preceding example relies on the `MongoConverter` to implicitly convert between the `String` id and the `ObjectId` stored in the database by recognizing the naming convention of the `Id` property.
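The id-handling convention mentioned above can be sketched in plain Java. This is a hypothetical stand-in, not the `MongoConverter` implementation: under the documented convention, a `String` id whose value is a valid 24-character hex string can be stored as an `ObjectId`, while other values remain plain strings.

```java
import java.util.regex.Pattern;

// Hypothetical sketch of the id-conversion convention: it only classifies
// which BSON type a String id would map to, it does not touch a database.
public class IdConventionSketch {

    // An ObjectId is represented externally as a 24-character hex string.
    private static final Pattern OBJECT_ID_HEX = Pattern.compile("[0-9a-fA-F]{24}");

    // Returns the storage type a String id would get under this convention.
    public static String storageTypeFor(String id) {
        return OBJECT_ID_HEX.matcher(id).matches() ? "ObjectId" : "String";
    }

    public static void main(String[] args) {
        System.out.println(storageTypeFor("5a1e9bff1234567890abcdef")); // valid 24-char hex
        System.out.println(storageTypeFor("joe@example.org"));          // kept as a String
    }
}
```

This is why the example above can declare `id` as a `String` in the POJO while the document in MongoDB carries an `ObjectId`.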
-NOTE: This example is meant to show the use of save, update and remove operations on `ReactiveMongoTemplate` and not to show complex mapping or functional chaining functionality +NOTE: The preceding example is meant to show the use of save, update, and remove operations on `ReactiveMongoTemplate` and not to show complex mapping or chaining functionality. -The query syntax used in the example is explained in more detail in the section <>. Additional documentation can be found in <> section. +"`<>`" explains the query syntax used in the preceding example in more detail. Additional documentation can be found in the <> section. [[mongo.reactive.executioncallback]] -== Execution callbacks +== Execution Callbacks One common design feature of all Spring template classes is that all functionality is routed into one of the template's `execute` callback methods. This helps ensure that exceptions and any resource management that may be required are performed consistently. While this was of much greater need in the case of JDBC and JMS than with MongoDB, it still offers a single spot for exception translation and logging to occur. As such, using the execute callback is the preferred way to access the MongoDB driver's `MongoDatabase` and `MongoCollection` objects to perform uncommon operations that are not exposed as methods on `ReactiveMongoTemplate`. The following list describes the `execute` callback methods: -* ` Flux` *execute* `(Class entityClass, ReactiveCollectionCallback action)` Executes the given ReactiveCollectionCallback for the entity collection of the specified class. +* ` Flux` *execute* `(Class entityClass, ReactiveCollectionCallback action)`: Executes the given `ReactiveCollectionCallback` for the entity collection of the specified class. -* ` Flux` *execute* `(String collectionName, ReactiveCollectionCallback action)` Executes the given ReactiveCollectionCallback on the collection of the given name.
+* ` Flux` *execute* `(String collectionName, ReactiveCollectionCallback action)`: Executes the given `ReactiveCollectionCallback` on the collection of the given name. -* ` Flux` *execute* `(ReactiveDatabaseCallback action)` Executes a ReactiveDatabaseCallback translating any exceptions as necessary. +* ` Flux` *execute* `(ReactiveDatabaseCallback action)`: Executes a `ReactiveDatabaseCallback`, translating any exceptions as necessary. -Here is an example that uses the `ReactiveCollectionCallback` to return information about an index +The following example uses the `ReactiveCollectionCallback` to return information about an index: [source,java] ----