Compare commits

...

68 Commits

Author SHA1 Message Date
Mark Paluch
4fbf01467e Release version 3.2.6 (2021.0.6).
See #3828
2021-10-18 11:11:44 +02:00
Mark Paluch
012d1245b0 Prepare 3.2.6 (2021.0.6).
See #3828
2021-10-18 11:10:48 +02:00
Mark Paluch
d8eb0f124a After release cleanups.
See #3769
2021-09-17 09:27:38 +02:00
Mark Paluch
bfeb896c70 Prepare next development iteration.
See #3769
2021-09-17 09:27:35 +02:00
Mark Paluch
efffc936fa Release version 3.2.5 (2021.0.5).
See #3769
2021-09-17 09:18:31 +02:00
Mark Paluch
563a3fb845 Prepare 3.2.5 (2021.0.5).
See #3769
2021-09-17 09:17:28 +02:00
Christoph Strobl
50ae6fd045 Change visibility of PersistentEntitiesFactoryBean.
Closes: #3825
2021-09-15 15:30:10 +02:00
Christoph Strobl
ae0e240334 Move and add tests to UpdateMapper.
Also update author information.

Original Pull Request: #3815
2021-09-13 14:36:15 +02:00
divyajnu08
852a461429 Fix update mapping using nested integer keys on map structures.
Closes: #3775
Original Pull Request: #3815
2021-09-13 14:36:05 +02:00
Mark Paluch
2cbed2a052 Upgrade to Maven Wrapper 3.8.2.
See #3819
2021-09-10 15:39:33 +02:00
Mark Paluch
95667edec3 Reduce allocations in query and update mapping.
Introduce EmptyDocument and utility methods in BsonUtils. Avoid entrySet and iterator creation for document iterations/inspections.

Original Pull Request: #3809
2021-09-09 08:21:22 +02:00
Mark Paluch
c1a52de8e5 Introduce SessionSynchronization.NEVER to disable transactional participation.
SessionSynchronization.NEVER bypasses all transactional integration for applications that do not want to use transactions, so that the overhead of transaction inspection is avoided.

Original Pull Request: #3809
2021-09-09 08:14:16 +02:00
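
A minimal sketch of disabling transaction participation this way, assuming an existing MongoDatabaseFactory bean named factory:

    import org.springframework.data.mongodb.SessionSynchronization;
    import org.springframework.data.mongodb.core.MongoTemplate;

    MongoTemplate template = new MongoTemplate(factory);
    // Skip transaction inspection entirely; operations never join an ongoing transaction.
    template.setSessionSynchronization(SessionSynchronization.NEVER);
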
Christoph Strobl
7e94c1bdc3 Fix slice argument in query fields projection.
We now use a Collection instead of an array to pass on $slice projection values for offset and limit.

Closes: #3811
Original pull request: #3812.
2021-09-08 14:47:41 +02:00
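
For context, a $slice projection with offset and limit is declared through Query.fields(); the collection-based values fixed here are produced internally (field name is illustrative):

    import org.springframework.data.mongodb.core.query.Query;

    Query query = new Query();
    // Renders as { "comments" : { "$slice" : [ 5, 10 ] } } (offset 5, limit 10).
    query.fields().slice("comments", 5, 10);
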
Mark Paluch
c3259e395c Avoid nested Document conversion to primitive types for fields with an explicit write target.
We no longer attempt to convert query Documents into primitive types, to avoid e.g. Document to String conversion.

Closes: #3783
Original Pull Request: #3797
2021-09-02 10:23:14 +02:00
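
An explicit write target is declared via the @Field annotation's targetType attribute; a minimal sketch, with class and property names purely illustrative:

    import org.springframework.data.mongodb.core.mapping.Field;
    import org.springframework.data.mongodb.core.mapping.FieldType;

    class Book {
        // Explicit write target: plain values are converted to ObjectId on write,
        // but nested query Documents are now passed through untouched.
        @Field(targetType = FieldType.OBJECT_ID)
        String isbn;
    }
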
Mark Paluch
3526b6a2d8 Polishing.
Reorder methods. Add since tag. Simplify assertions. Use diamond syntax.

See: #3776
Original pull request: #3777.
2021-08-25 14:58:41 +02:00
Ivan Volzhev
cb70a97ea8 Relax requirement for GeoJsonMultiPoint construction allowing creation using a single point.
Only one point is required per the GeoJSON RFC, and MongoDB works just fine with a single point as well.

Closes #3776
Original pull request: #3777.
2021-08-25 14:58:41 +02:00
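
The relaxed constructor (added in 3.2.5, see the GeoJsonMultiPoint diff below) in use:

    import org.springframework.data.geo.Point;
    import org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint;

    // A single point now suffices; previously at least two were required.
    GeoJsonMultiPoint multiPoint = new GeoJsonMultiPoint(new Point(12.48, 41.89));
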
Mark Paluch
52415bc702 Polishing.
Update since version. Reformat code.

See: #3761.
2021-08-25 14:33:45 +02:00
sangyongchoi
43de140842 Add Criteria infix functions for maxDistance and minDistance.
Closes: #3761
2021-08-25 14:33:45 +02:00
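
The Kotlin infix functions wrap the existing Criteria API; a sketch of the underlying Java calls, with field name and coordinates illustrative:

    import org.springframework.data.geo.Point;
    import org.springframework.data.mongodb.core.query.Criteria;

    Criteria near = Criteria.where("location")
        .near(new Point(12.48, 41.89))
        .minDistance(0.01)  // lower distance bound (units depend on coordinate type)
        .maxDistance(0.91); // upper distance bound
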
Mark Paluch
15b000ecce Polishing.
Fix typo in reference docs.

See #3758
2021-08-25 10:15:19 +02:00
Ryan Gibb
e428b9b977 Fix a typo in MongoConverter javadoc.
Original pull request: #3758.
2021-08-25 10:15:19 +02:00
Mark Paluch
6e38610ac1 Polishing.
Fix asterisk callouts.

See #3786
2021-08-24 11:19:41 +02:00
Mark Paluch
e7af70efca Extract Aggregation Framework and GridFS docs in own source files.
Closes #3786
2021-08-24 11:08:52 +02:00
Christoph Strobl
39f5f91261 Change visibility of Reactive/MongoRepositoryFactoryBean setters.
Setters of the FactoryBean should be public.

Closes: #3779
Original pull request: #3780.
2021-08-24 10:26:59 +02:00
Jens Schauder
c48daa6d56 After release cleanups.
See #3735
2021-08-12 11:37:31 +02:00
Jens Schauder
11356cd20f Prepare next development iteration.
See #3735
2021-08-12 11:37:29 +02:00
Jens Schauder
7385262c47 Release version 3.2.4 (2021.0.4).
See #3735
2021-08-12 11:22:50 +02:00
Jens Schauder
259938588a Prepare 3.2.4 (2021.0.4).
See #3735
2021-08-12 11:22:26 +02:00
Mark Paluch
a1b4e3fc55 Run unpaged query using Pageable.unpaged() through QuerydslMongoPredicateExecutor.findAll(…).
We now correctly consider unpaged queries if the Pageable is unpaged.

Closes: #3751
Original Pull Request: #3754
2021-07-26 15:16:30 +02:00
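
A sketch of the fixed call path, assuming a repository extending QuerydslPredicateExecutor<Person> and a generated QPerson query type:

    import org.springframework.data.domain.Page;
    import org.springframework.data.domain.Pageable;

    Page<Person> all = repository.findAll(QPerson.person.lastname.eq("Matthews"),
            Pageable.unpaged());
    // Now runs as a single unpaged query instead of applying paging defaults.
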
Jens Schauder
76479820bc After release cleanups.
See #3682
2021-07-16 11:51:05 +02:00
Jens Schauder
c47bbc4a20 Prepare next development iteration.
See #3682
2021-07-16 11:51:02 +02:00
Jens Schauder
74791d0bca Release version 3.2.3 (2021.0.3).
See #3682
2021-07-16 11:35:22 +02:00
Jens Schauder
f4d2287011 Prepare 3.2.3 (2021.0.3).
See #3682
2021-07-16 11:34:27 +02:00
Jens Schauder
ab6ba194c1 Updated changelog.
See #3682
2021-07-16 11:34:20 +02:00
Mark Paluch
595a346705 Polishing.
Support DBObject and Map as sources for entity materialization and map conversion.

See #3702
Original pull request: #3704.
2021-07-15 10:00:00 +02:00
Christoph Strobl
08c5e5a810 Fix raw document conversion in Collection like properties.
Along the way, make sure to convert map-like structures correctly if they do not come as a Document, e.g. because they got converted to a plain Map in a post-load, pre-convert event.

Closes #3702
Original pull request: #3704.
2021-07-15 09:59:50 +02:00
Christoph Strobl
f987217c3c Custom Converter should also be applicable for simple types.
This commit fixes a regression that prevented custom converters from being applied to types considered store-native.

Original pull request: #3703.
Fixes #3670
2021-07-15 09:00:35 +02:00
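
A minimal sketch of registering such a converter for a store-native source type (the converter itself is hypothetical):

    import java.util.Collections;
    import java.util.Locale;
    import org.springframework.core.convert.converter.Converter;
    import org.springframework.data.convert.ReadingConverter;
    import org.springframework.data.mongodb.core.convert.MongoCustomConversions;

    // Hypothetical converter: String is a store-native (simple) type.
    @ReadingConverter
    class StringToLocaleConverter implements Converter<String, Locale> {
        @Override
        public Locale convert(String source) {
            return Locale.forLanguageTag(source);
        }
    }

    MongoCustomConversions conversions =
            new MongoCustomConversions(Collections.singletonList(new StringToLocaleConverter()));
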
Christoph Strobl
92a22978c2 Polishing.
Simplify KeyMapper current property/index setup.

Original Pull Request: #3689
2021-07-06 12:56:10 +02:00
David Julia
2e2e076b5b Fix Regression in generating queries with nested maps with numeric keys.
Maps with numeric keys work if there is only one map with an integer key in a query, but queries containing multiple maps with numeric keys fail.

Take the following example for a map called outer with numeric keys, holding a reference to another object with a map called inner, also with numeric keys: updates that are meant to generate {"$set": {"outerMap.1234.inner.5678": "hello"}} instead generate {"$set": {"outerMap.1234.inner.inner": "hello"}}, repeating the latter map property name instead of using the integer key value.

This commit adds unit tests both for the UpdateMapper and QueryMapper, which check multiple consecutive maps with numeric keys, and adds a fix in the KeyMapper. Because we cannot easily change the path parsing to somehow parse path parts corresponding to map keys differently, we address the issue in the KeyMapper. We keep track of the partial path corresponding to the current property and use it to skip adding the duplicated property name for the map to the query, and instead add the key.

This is a bit redundant in that we now have both an iterator and an index-based way of accessing the path parts, but it gets the tests passing and fixes the issue without making a large change to the current approach.

Fixes: #3688
Original Pull Request: #3689
2021-07-06 12:05:58 +02:00
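
A sketch of an update that hit the bug, with the entity layout assumed from the description above (an outer Map<Integer, ...> whose values hold an inner Map<Integer, String>):

    import org.springframework.data.mongodb.core.query.Update;

    Update update = new Update().set("outerMap.1234.inner.5678", "hello");
    // Now maps to {"$set": {"outerMap.1234.inner.5678": "hello"}}
    // instead of {"$set": {"outerMap.1234.inner.inner": "hello"}}.
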
Christoph Strobl
0c50d97887 Fix NPE when reading/mapping null value inside collection.
Closes: #3686
2021-07-01 11:16:13 +02:00
Christoph Strobl
c10d4b6af0 Favor ObjectUtils over Objects for equals/hashCode.
Original Pull Request: #3684
2021-06-24 13:43:55 +02:00
Gatto
6644ac6875 Add equals and hashCode to UnwrappedMongoPersistentProperty.
Fixes #3683
Original Pull Request: #3684
2021-06-24 13:42:12 +02:00
Mark Paluch
708def0df1 After release cleanups.
See #3650
2021-06-22 16:05:22 +02:00
Mark Paluch
889e5d52bb Prepare next development iteration.
See #3650
2021-06-22 16:05:19 +02:00
Mark Paluch
8930091b33 Release version 3.2.2 (2021.0.2).
See #3650
2021-06-22 15:52:27 +02:00
Mark Paluch
b1d750efed Prepare 3.2.2 (2021.0.2).
See #3650
2021-06-22 15:51:38 +02:00
Mark Paluch
7a19593f02 Updated changelog.
See #3650
2021-06-22 15:51:33 +02:00
Mark Paluch
9021445ccd Updated changelog.
See #3649
2021-06-22 15:29:52 +02:00
Mark Paluch
db92c37502 Update reference docs to use correct MongoClient.
Closes #3666
2021-06-22 14:37:02 +02:00
larsw
99e5e2596e Add closing quote to GeoJson javadoc.
Closes #3677
2021-06-21 13:58:40 +02:00
Christoph Strobl
d0bf0e2e62 Fix field projection value conversion.
The field projection conversion should only map field names and avoid value conversion. In the MongoId case, an inclusion parameter (1) was unintentionally converted into its String representation, which causes trouble on MongoDB 4.4 servers.

Fixes: #3668
Original pull request: #3678.
2021-06-21 13:45:54 +02:00
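
For reference, the affected projection shape (entity and property are illustrative, with the id property annotated @MongoId):

    import org.springframework.data.mongodb.core.query.Query;

    Query query = new Query();
    query.fields().include("id");
    // Must render as { "_id" : 1 }; the inclusion flag 1 is no longer
    // converted into its String representation "1".
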
Christoph Strobl
28efb3afbe Polishing.
Fix typo in class name and make sure MongoTestTemplate uses the configured simple types.

See: #3659
Original pull request: #3661.
2021-06-18 14:27:18 +02:00
Christoph Strobl
99eb849c93 Fix query mapper path resolution for types considered simple ones.
spring-projects/spring-data-commons#2293 changed how PersistentProperty paths get resolved and considers potentially registered converters for those, which made path resolution fail during the query mapping process.
This commit makes sure to capture the corresponding exception and continue with the given user input.

Fixes: #3659
Original pull request: #3661.
2021-06-18 14:13:33 +02:00
Christoph Strobl
d33aa682e5 Fix $or / $nor keyword mapping in query mapper.
This commit fixes an issue with the pattern used for detecting $or / $nor, which also matched other keywords like $floor.

Closes: #3635
Original pull request: #3637.
2021-06-18 13:49:37 +02:00
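
The defect is easy to reproduce with plain java.util.regex; the old pattern "\$.*or" matches any keyword ending in "or":

    "$floor".matches("\\$.*or"); // true: wrongly detected as $or/$nor
    "$or".matches("\\$.*or");    // true
    // The fix compares keys literally instead (see the Keyword diff below):
    "$floor".equalsIgnoreCase("$or") || "$floor".equalsIgnoreCase("$nor"); // false
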
Mark Paluch
f6db089f6f Polishing.
Add nullability annotation. Return early on null value conversion.

See #3633
Original pull request: #3643.
2021-06-09 14:14:49 +02:00
Christoph Strobl
13ae5e17bb Fix NPE in QueryMapper when trying to apply target type on null value.
Closes #3633
Original pull request: #3643.
2021-06-09 14:14:44 +02:00
Mark Paluch
ee203bf22a Polishing.
Reformat code.

See #3660.
Original pull request: #3662.
2021-06-09 11:34:20 +02:00
Christoph Strobl
990696ba11 Fix conversion for types having a converter registered.
Fixes: #3660
Original pull request: #3662.
2021-06-09 11:33:34 +02:00
Mark Paluch
4bc2f108fe After release cleanups.
See #3629
2021-05-14 12:34:20 +02:00
Mark Paluch
5064ba5b24 Prepare next development iteration.
See #3629
2021-05-14 12:34:16 +02:00
Mark Paluch
1179ded140 Release version 3.2.1 (2021.0.1).
See #3629
2021-05-14 12:23:55 +02:00
Mark Paluch
e12700c00b Prepare 3.2.1 (2021.0.1).
See #3629
2021-05-14 12:23:21 +02:00
Mark Paluch
0507adab20 Updated changelog.
See #3629
2021-05-14 12:23:14 +02:00
Mark Paluch
375ddf8afb Updated changelog.
See #3628
2021-05-14 12:06:39 +02:00
Mark Paluch
283cf06dc1 Introduce template method for easier customization of fragments.
Closes #3638.
2021-04-27 10:46:01 +02:00
Greg L. Turnquist
10a8456581 Authenticate with artifactory.
See #3616.
2021-04-22 15:01:43 -05:00
Clément Petit
e751a43cdf Fix bullet points in aggregation framework reference documentation.
Closes: #3632
2021-04-20 08:22:33 +02:00
Mark Paluch
c987ba5f83 After release cleanups.
See #3616
2021-04-14 14:30:40 +02:00
Mark Paluch
950bae0306 Prepare next development iteration.
See #3616
2021-04-14 14:30:36 +02:00
56 changed files with 2421 additions and 973 deletions


@@ -1 +1,2 @@
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.5.4/apache-maven-3.5.4-bin.zip
#Fri Sep 10 15:39:33 CEST 2021
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.2/apache-maven-3.8.2-bin.zip

Jenkinsfile (vendored): 34 changes

@@ -3,7 +3,7 @@ pipeline {
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/master", threshold: hudson.model.Result.SUCCESS)
upstream(upstreamProjects: "spring-data-commons/2.5.x", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -68,7 +68,7 @@ pipeline {
stage("test: baseline (jdk8)") {
when {
anyOf {
branch 'master'
branch '3.2.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -76,6 +76,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -85,7 +88,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -95,7 +98,7 @@ pipeline {
stage("Test other configurations") {
when {
allOf {
branch 'master'
branch '3.2.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -105,6 +108,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -114,7 +120,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -126,6 +132,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -135,7 +144,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -147,6 +156,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -156,7 +168,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pjava11 clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pjava11 clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -168,7 +180,7 @@ pipeline {
stage('Release to artifactory') {
when {
anyOf {
branch 'master'
branch '3.2.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -185,7 +197,7 @@ pipeline {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-v $HOME:/tmp/jenkins-home') {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
@@ -201,7 +213,7 @@ pipeline {
stage('Publish documentation') {
when {
branch 'master'
branch '3.2.x'
}
agent {
label 'data'
@@ -216,7 +228,7 @@ pipeline {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-v $HOME:/tmp/jenkins-home') {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,distribute ' +
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,distribute ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +

LICENSE.txt (new file): 202 additions

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
https://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright {yyyy} {name of copyright owner}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

pom.xml: 11 changes

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.2.0</version>
<version>3.2.6</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.5.0</version>
<version>2.5.6</version>
</parent>
<modules>
@@ -26,7 +26,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.5.0</springdata.commons>
<springdata.commons>2.5.6</springdata.commons>
<mongo>4.2.3</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
@@ -158,11 +158,6 @@
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</pluginRepository>
<pluginRepository>
<id>bintray-plugins</id>
<name>bintray-plugins</name>
<url>https://jcenter.bintray.com</url>
</pluginRepository>
</pluginRepositories>
</project>

settings.xml (new file): 29 additions

@@ -0,0 +1,29 @@
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
https://maven.apache.org/xsd/settings-1.0.0.xsd">
<servers>
<server>
<id>spring-plugins-release</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-snapshot</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-milestone</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-release</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
</servers>
</settings>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.2.0</version>
<version>3.2.6</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.2.0</version>
<version>3.2.6</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.2.0</version>
<version>3.2.6</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -104,7 +104,8 @@ public class MongoDatabaseUtils {
Assert.notNull(factory, "Factory must not be null!");
if (!TransactionSynchronizationManager.isSynchronizationActive()) {
if (sessionSynchronization == SessionSynchronization.NEVER
|| !TransactionSynchronizationManager.isSynchronizationActive()) {
return StringUtils.hasText(dbName) ? factory.getMongoDatabase(dbName) : factory.getMongoDatabase();
}


@@ -138,6 +138,10 @@ public class ReactiveMongoDatabaseUtils {
Assert.notNull(factory, "DatabaseFactory must not be null!");
if (sessionSynchronization == SessionSynchronization.NEVER) {
return getMongoDatabaseOrDefault(dbName, factory);
}
return TransactionSynchronizationManager.forCurrentTransaction()
.filter(TransactionSynchronizationManager::isSynchronizationActive) //
.flatMap(synchronizationManager -> {


@@ -15,13 +15,20 @@
*/
package org.springframework.data.mongodb;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
/**
* {@link SessionSynchronization} is used along with {@link org.springframework.data.mongodb.core.MongoTemplate} to
* define in which type of transactions to participate if any.
* {@link SessionSynchronization} is used along with {@code MongoTemplate} to define in which type of transactions to
* participate if any.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
* @see MongoTemplate#setSessionSynchronization(SessionSynchronization)
* @see MongoDatabaseUtils#getDatabase(MongoDatabaseFactory, SessionSynchronization)
* @see ReactiveMongoTemplate#setSessionSynchronization(SessionSynchronization)
* @see ReactiveMongoDatabaseUtils#getDatabase(ReactiveMongoDatabaseFactory, SessionSynchronization)
*/
public enum SessionSynchronization {
@@ -34,5 +41,12 @@ public enum SessionSynchronization {
/**
* Synchronize with native MongoDB transactions initiated via {@link MongoTransactionManager}.
*/
ON_ACTUAL_TRANSACTION;
ON_ACTUAL_TRANSACTION,
/**
* Do not participate in ongoing transactions.
*
* @since 3.2.5
*/
NEVER;
}


@@ -28,7 +28,7 @@ import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
* @author Christoph Strobl
* @since 3.1
*/
class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEntities> {
public class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEntities> {
private final MappingMongoConverter converter;


@@ -156,5 +156,14 @@ public class MappedDocument {
public List<ArrayFilter> getArrayFilters() {
return delegate.getArrayFilters();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#hasArrayFilters()
*/
@Override
public boolean hasArrayFilters() {
return delegate.hasArrayFilters();
}
}
}


@@ -613,7 +613,7 @@ class QueryOperations {
UpdateContext(MappedDocument update, boolean upsert) {
super(new BasicQuery(new Document(BsonUtils.asMap(update.getIdFilter()))));
super(new BasicQuery(BsonUtils.asDocument(update.getIdFilter())));
this.multi = false;
this.upsert = upsert;
this.mappedDocument = update;


@@ -135,7 +135,7 @@ class DocumentAccessor {
*/
@Nullable
public Object getRawId(MongoPersistentEntity<?> entity) {
return entity.hasIdProperty() ? get(entity.getRequiredIdProperty()) : BsonUtils.asMap(document).get("_id");
return entity.hasIdProperty() ? get(entity.getRequiredIdProperty()) : BsonUtils.get(document, "_id");
}
/**


@@ -25,7 +25,6 @@ import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Optional;
import java.util.Set;
@@ -141,7 +140,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this::getWriteTarget);
this.idMapper = new QueryMapper(this);
this.spELContext = new SpELContext(DocumentPropertyAccessor.INSTANCE);
this.dbRefProxyHandler = new DefaultDbRefProxyHandler(spELContext, mappingContext,
(prop, bson, evaluator, path) -> {
@@ -161,8 +159,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(path, "ObjectPath must not be null");
return new ConversionContext(path, this::readDocument, this::readCollectionOrArray, this::readMap, this::readDBRef,
this::getPotentiallyConvertedSimpleRead);
return new ConversionContext(conversions, path, this::readDocument, this::readCollectionOrArray, this::readMap,
this::readDBRef, this::getPotentiallyConvertedSimpleRead);
}
/**
@@ -376,8 +374,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
private <S> S populateProperties(ConversionContext context, MongoPersistentEntity<S> entity,
DocumentAccessor documentAccessor,
SpELExpressionEvaluator evaluator, S instance) {
DocumentAccessor documentAccessor, SpELExpressionEvaluator evaluator, S instance) {
PersistentPropertyAccessor<S> accessor = new ConvertingPropertyAccessor<>(entity.getPropertyAccessor(instance),
conversionService);
@@ -423,8 +420,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
@Nullable
private Object readIdValue(ConversionContext context, SpELExpressionEvaluator evaluator,
MongoPersistentProperty idProperty,
Object rawId) {
MongoPersistentProperty idProperty, Object rawId) {
String expression = idProperty.getSpelExpression();
Object resolvedValue = expression != null ? evaluator.evaluate(expression) : rawId;
@@ -434,8 +430,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private void readProperties(ConversionContext context, MongoPersistentEntity<?> entity,
PersistentPropertyAccessor<?> accessor, DocumentAccessor documentAccessor,
MongoDbPropertyValueProvider valueProvider,
SpELExpressionEvaluator evaluator) {
MongoDbPropertyValueProvider valueProvider, SpELExpressionEvaluator evaluator) {
DbRefResolverCallback callback = null;
@@ -505,8 +500,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
@Nullable
private Object readUnwrapped(ConversionContext context, DocumentAccessor documentAccessor,
MongoPersistentProperty prop,
MongoPersistentEntity<?> unwrappedEntity) {
MongoPersistentProperty prop, MongoPersistentEntity<?> unwrappedEntity) {
if (prop.findAnnotation(Unwrapped.class).onEmpty().equals(OnEmpty.USE_EMPTY)) {
return read(context, unwrappedEntity, (Document) documentAccessor.getDocument());
@@ -1045,7 +1039,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object getPotentiallyConvertedSimpleRead(Object value, @Nullable Class<?> target) {
if (target == null || ClassUtils.isAssignableValue(target, value)) {
if (target == null) {
return value;
}
@@ -1053,6 +1047,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return doConvert(value, target);
}
if (ClassUtils.isAssignableValue(target, value)) {
return value;
}
if (Enum.class.isAssignableFrom(target)) {
return Enum.valueOf((Class<Enum>) target, value.toString());
}
@@ -1141,7 +1139,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
for (Object element : source) {
items.add(context.convert(element, componentType));
items.add(element != null ? context.convert(element, componentType) : element);
}
return getPotentiallyConvertedSimpleRead(items, targetType.getType());
@@ -1193,21 +1191,22 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return map;
}
for (Entry<String, Object> entry : sourceMap.entrySet()) {
sourceMap.forEach((k, v) -> {
if (typeMapper.isTypeKey(entry.getKey())) {
continue;
if (typeMapper.isTypeKey(k)) {
return;
}
Object key = potentiallyUnescapeMapKey(entry.getKey());
Object key = potentiallyUnescapeMapKey(k);
if (!rawKeyType.isAssignableFrom(key.getClass())) {
key = doConvert(key, rawKeyType);
}
Object value = entry.getValue();
map.put(key, context.convert(value, valueType));
}
Object value = v;
map.put(key, value == null ? value : context.convert(value, valueType));
});
return map;
}
@@ -1447,8 +1446,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
T target = null;
if (document != null) {
maybeEmitEvent(
new AfterLoadEvent<>(document, (Class<T>) type.getType(), collectionName));
maybeEmitEvent(new AfterLoadEvent<>(document, (Class<T>) type.getType(), collectionName));
target = (T) readDocument(context, document, type);
}
@@ -1541,9 +1539,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
@SuppressWarnings("ConstantConditions")
private <T extends Object> T doConvert(Object value, Class<? extends T> target, @Nullable Class<? extends T> fallback) {
private <T extends Object> T doConvert(Object value, Class<? extends T> target,
@Nullable Class<? extends T> fallback) {
if(conversionService.canConvert(value.getClass(), target) || fallback == null) {
if (conversionService.canConvert(value.getClass(), target) || fallback == null) {
return conversionService.convert(value, target);
}
return conversionService.convert(value, fallback);
@@ -1853,6 +1852,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
protected static class ConversionContext {
private final org.springframework.data.convert.CustomConversions conversions;
private final ObjectPath path;
private final ContainerValueConverter<Bson> documentConverter;
private final ContainerValueConverter<Collection<?>> collectionConverter;
@@ -1860,10 +1860,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private final ContainerValueConverter<DBRef> dbRefConverter;
private final ValueConverter<Object> elementConverter;
ConversionContext(ObjectPath path, ContainerValueConverter<Bson> documentConverter,
ContainerValueConverter<Collection<?>> collectionConverter, ContainerValueConverter<Bson> mapConverter,
ContainerValueConverter<DBRef> dbRefConverter, ValueConverter<Object> elementConverter) {
ConversionContext(org.springframework.data.convert.CustomConversions customConversions, ObjectPath path,
ContainerValueConverter<Bson> documentConverter, ContainerValueConverter<Collection<?>> collectionConverter,
ContainerValueConverter<Bson> mapConverter, ContainerValueConverter<DBRef> dbRefConverter,
ValueConverter<Object> elementConverter) {
this.conversions = customConversions;
this.path = path;
this.documentConverter = documentConverter;
this.collectionConverter = collectionConverter;
@@ -1882,8 +1884,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
@SuppressWarnings("unchecked")
public <S extends Object> S convert(Object source, TypeInformation<? extends S> typeHint) {
Assert.notNull(source, "Source must not be null");
Assert.notNull(typeHint, "TypeInformation must not be null");
if (conversions.hasCustomReadTarget(source.getClass(), typeHint.getType())) {
return (S) elementConverter.convert(source, typeHint);
}
if (source instanceof Collection) {
Class<?> rawType = typeHint.getType();
@@ -1900,7 +1907,16 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (typeHint.isMap()) {
return (S) mapConverter.convert(this, (Bson) source, typeHint);
if(ClassUtils.isAssignable(Document.class, typeHint.getType())) {
return (S) documentConverter.convert(this, BsonUtils.asBson(source), typeHint);
}
if (BsonUtils.supportsBson(source)) {
return (S) mapConverter.convert(this, BsonUtils.asBson(source), typeHint);
}
throw new IllegalArgumentException(String.format("Expected map like structure but found %s", source.getClass()));
}
if (source instanceof DBRef) {
@@ -1912,8 +1928,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
String.format(INCOMPATIBLE_TYPES, source, BasicDBList.class, typeHint.getType(), getPath()));
}
if (source instanceof Bson) {
return (S) documentConverter.convert(this, (Bson) source, typeHint);
if (BsonUtils.supportsBson(source)) {
return (S) documentConverter.convert(this, BsonUtils.asBson(source), typeHint);
}
return (S) elementConverter.convert(source, typeHint);
@@ -1929,8 +1945,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(currentPath, "ObjectPath must not be null");
return new ConversionContext(currentPath, documentConverter, collectionConverter, mapConverter, dbRefConverter,
elementConverter);
return new ConversionContext(conversions, currentPath, documentConverter, collectionConverter, mapConverter,
dbRefConverter, elementConverter);
}
public ObjectPath getPath() {


@@ -40,13 +40,14 @@ import com.mongodb.DBRef;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author Ryan Gibb
*/
public interface MongoConverter
extends EntityConverter<MongoPersistentEntity<?>, MongoPersistentProperty, Object, Bson>, MongoWriter<Object>,
EntityReader<Object, Bson> {
/**
* Returns thw {@link TypeMapper} being used to write type information into {@link Document}s created with that
* Returns the {@link TypeMapper} being used to write type information into {@link Document}s created with that
* converter.
*
* @return will never be {@literal null}.
@@ -139,6 +140,9 @@ public interface MongoConverter
if (ObjectId.isValid(id.toString())) {
return new ObjectId(id.toString());
}
// avoid ConversionException as convertToMongoType will return String anyways.
return id;
}
}


@@ -19,11 +19,14 @@ import java.util.*;
import java.util.Map.Entry;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.domain.Example;
@@ -65,9 +68,13 @@ import com.mongodb.DBRef;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author David Julia
* @author Divya Srivastava
*/
public class QueryMapper {
protected static final Logger LOGGER = LoggerFactory.getLogger(QueryMapper.class);
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final Document META_TEXT_SCORE = new Document("$meta", "textScore");
static final ClassTypeInformation<?> NESTED_DOCUMENT = ClassTypeInformation.from(NestedDocument.class);
@@ -186,24 +193,11 @@ public class QueryMapper {
Assert.notNull(sortObject, "SortObject must not be null!");
if (sortObject.isEmpty()) {
return new Document();
return BsonUtils.EMPTY_DOCUMENT;
}
sortObject = filterUnwrappedObjects(sortObject, entity);
Document mappedSort = new Document();
for (Map.Entry<String, Object> entry : BsonUtils.asMap(sortObject).entrySet()) {
Field field = createPropertyField(entity, entry.getKey(), mappingContext);
if (field.getProperty() != null && field.getProperty().isUnwrapped()) {
continue;
}
mappedSort.put(field.getMappedKey(), entry.getValue());
}
mapMetaAttributes(mappedSort, entity, MetaMapping.WHEN_PRESENT);
return mappedSort;
Document mappedSort = mapFieldsToPropertyNames(sortObject, entity);
return mapMetaAttributes(mappedSort, entity, MetaMapping.WHEN_PRESENT);
}
/**
@@ -219,26 +213,52 @@ public class QueryMapper {
Assert.notNull(fieldsObject, "FieldsObject must not be null!");
fieldsObject = filterUnwrappedObjects(fieldsObject, entity);
Document mappedFields = getMappedObject(fieldsObject, entity);
mapMetaAttributes(mappedFields, entity, MetaMapping.FORCE);
return mappedFields;
Document mappedFields = mapFieldsToPropertyNames(fieldsObject, entity);
return mapMetaAttributes(mappedFields, entity, MetaMapping.FORCE);
}
private void mapMetaAttributes(Document source, @Nullable MongoPersistentEntity<?> entity, MetaMapping metaMapping) {
private Document mapFieldsToPropertyNames(Document fields, @Nullable MongoPersistentEntity<?> entity) {
if (fields.isEmpty()) {
return BsonUtils.EMPTY_DOCUMENT;
}
Document target = new Document();
BsonUtils.asMap(filterUnwrappedObjects(fields, entity)).forEach((k, v) -> {
Field field = createPropertyField(entity, k, mappingContext);
if (field.getProperty() != null && field.getProperty().isUnwrapped()) {
return;
}
target.put(field.getMappedKey(), v);
});
return target;
}
private Document mapMetaAttributes(Document source, @Nullable MongoPersistentEntity<?> entity,
MetaMapping metaMapping) {
if (entity == null) {
return;
return source;
}
if (entity.hasTextScoreProperty() && !MetaMapping.IGNORE.equals(metaMapping)) {
if (source == BsonUtils.EMPTY_DOCUMENT) {
source = new Document();
}
MongoPersistentProperty textScoreProperty = entity.getTextScoreProperty();
if (MetaMapping.FORCE.equals(metaMapping)
|| (MetaMapping.WHEN_PRESENT.equals(metaMapping) && source.containsKey(textScoreProperty.getFieldName()))) {
source.putAll(getMappedTextScoreField(textScoreProperty));
}
}
return source;
}
private Document filterUnwrappedObjects(Document fieldsObject, @Nullable MongoPersistentEntity<?> entity) {
@@ -444,6 +464,10 @@ public class QueryMapper {
}
}
if (value == null) {
return null;
}
if (isNestedKeyword(value)) {
return getMappedKeyword(new Keyword((Bson) value), documentField.getPropertyEntity());
}
@@ -663,7 +687,7 @@ public class QueryMapper {
private Entry<String, Object> createMapEntry(String key, @Nullable Object value) {
Assert.hasText(key, "Key must not be null or empty!");
return Collections.singletonMap(key, value).entrySet().iterator().next();
return new AbstractMap.SimpleEntry<>(key, value);
}
private DBRef createDbRefFor(Object source, MongoPersistentProperty property) {
@@ -706,19 +730,19 @@ public class QueryMapper {
* @param candidate
* @return
*/
protected boolean isNestedKeyword(Object candidate) {
protected boolean isNestedKeyword(@Nullable Object candidate) {
if (!(candidate instanceof Document)) {
return false;
}
Set<String> keys = BsonUtils.asMap((Bson) candidate).keySet();
Map<String, Object> map = BsonUtils.asMap((Bson) candidate);
if (keys.size() != 1) {
if (map.size() != 1) {
return false;
}
return isKeyword(keys.iterator().next());
return isKeyword(map.entrySet().iterator().next().getKey());
}
/**
@@ -751,12 +775,14 @@ public class QueryMapper {
* converted one by one.
*
* @param documentField the field and its meta data
* @param value the actual value
* @param value the actual value. Can be {@literal null}.
* @return the potentially converted target value.
*/
private Object applyFieldTargetTypeHintToValue(Field documentField, Object value) {
@Nullable
private Object applyFieldTargetTypeHintToValue(Field documentField, @Nullable Object value) {
if (documentField.getProperty() == null || !documentField.getProperty().hasExplicitWriteTarget()) {
if (value == null || documentField.getProperty() == null || !documentField.getProperty().hasExplicitWriteTarget()
|| value instanceof Document || value instanceof DBObject) {
return value;
}
@@ -787,7 +813,6 @@ public class QueryMapper {
*/
static class Keyword {
private static final String N_OR_PATTERN = "\\$.*or";
private static final Set<String> NON_DBREF_CONVERTING_KEYWORDS = new HashSet<>(
Arrays.asList("$", "$size", "$slice", "$gt", "$lt"));
@@ -801,11 +826,14 @@ public class QueryMapper {
public Keyword(Bson bson) {
Set<String> keys = BsonUtils.asMap(bson).keySet();
Assert.isTrue(keys.size() == 1, "Can only use a single value Document!");
Map<String, Object> map = BsonUtils.asMap(bson);
Assert.isTrue(map.size() == 1, "Can only use a single value Document!");
this.key = keys.iterator().next();
this.value = BsonUtils.get(bson, key);
Set<Entry<String, Object>> entries = map.entrySet();
Entry<String, Object> entry = entries.iterator().next();
this.key = entry.getKey();
this.value = entry.getValue();
}
/**
@@ -818,7 +846,7 @@ public class QueryMapper {
}
public boolean isOrOrNor() {
return key.matches(N_OR_PATTERN);
return key.equalsIgnoreCase("$or") || key.equalsIgnoreCase("$nor");
}
/**
@@ -999,8 +1027,8 @@ public class QueryMapper {
*/
protected static class MetadataBackedField extends Field {
private static final Pattern POSITIONAL_PARAMETER_PATTERN = Pattern.compile("\\.\\$(\\[.*?\\])?|\\.\\d+");
private static final Pattern DOT_POSITIONAL_PATTERN = Pattern.compile("\\.\\d+");
private static final Pattern POSITIONAL_PARAMETER_PATTERN = Pattern.compile("\\.\\$(\\[.*?\\])?");
private static final Pattern DOT_POSITIONAL_PATTERN = Pattern.compile("\\.\\d+(?!$)");
private static final String INVALID_ASSOCIATION_REFERENCE = "Invalid path reference %s! Associations can only be pointed to directly or via their id property!";
private final MongoPersistentEntity<?> entity;
@@ -1169,8 +1197,8 @@ public class QueryMapper {
removePlaceholders(DOT_POSITIONAL_PATTERN, pathExpression));
if (sourceProperty != null && sourceProperty.getOwner().equals(entity)) {
return mappingContext
.getPersistentPropertyPath(PropertyPath.from(Pattern.quote(sourceProperty.getName()), entity.getTypeInformation()));
return mappingContext.getPersistentPropertyPath(
PropertyPath.from(Pattern.quote(sourceProperty.getName()), entity.getTypeInformation()));
}
PropertyPath path = forName(rawPath);
@@ -1178,29 +1206,47 @@ public class QueryMapper {
return null;
}
try {
PersistentPropertyPath<MongoPersistentProperty> propertyPath = tryToResolvePersistentPropertyPath(path);
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
if (propertyPath == null) {
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
boolean associationDetected = false;
if (QueryMapper.LOGGER.isInfoEnabled()) {
while (iterator.hasNext()) {
String types = StringUtils.collectionToDelimitedString(
path.stream().map(it -> it.getType().getSimpleName()).collect(Collectors.toList()), " -> ");
QueryMapper.LOGGER.info(
"Could not map '{}'. Maybe a fragment in '{}' is considered a simple type. Mapper continues with {}.",
path, types, pathExpression);
}
return null;
}
MongoPersistentProperty property = iterator.next();
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
boolean associationDetected = false;
if (property.isAssociation()) {
associationDetected = true;
continue;
}
while (iterator.hasNext()) {
if (associationDetected && !property.isIdProperty()) {
throw new MappingException(String.format(INVALID_ASSOCIATION_REFERENCE, pathExpression));
}
MongoPersistentProperty property = iterator.next();
if (property.isAssociation()) {
associationDetected = true;
continue;
}
return propertyPath;
} catch (InvalidPersistentPropertyPath e) {
if (associationDetected && !property.isIdProperty()) {
throw new MappingException(String.format(INVALID_ASSOCIATION_REFERENCE, pathExpression));
}
}
return propertyPath;
}
@Nullable
private PersistentPropertyPath<MongoPersistentProperty> tryToResolvePersistentPropertyPath(PropertyPath path) {
try {
return mappingContext.getPersistentPropertyPath(path);
} catch (MappingException e) {
return null;
}
}
@@ -1329,12 +1375,17 @@ public class QueryMapper {
static class KeyMapper {
private final Iterator<String> iterator;
private int currentIndex;
private String currentPropertyRoot;
private final List<String> pathParts;
public KeyMapper(String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.iterator = Arrays.asList(key.split("\\.")).iterator();
this.iterator.next();
this.pathParts = Arrays.asList(key.split("\\."));
this.iterator = pathParts.iterator();
this.currentPropertyRoot = iterator.next();
this.currentIndex = 0;
}
/**
@@ -1346,21 +1397,31 @@ public class QueryMapper {
protected String mapPropertyName(MongoPersistentProperty property) {
StringBuilder mappedName = new StringBuilder(PropertyToFieldNameConverter.INSTANCE.convert(property));
boolean inspect = iterator.hasNext();
while (inspect) {
String partial = iterator.next();
currentIndex++;
boolean isPositional = isPositionalParameter(partial) && property.isCollectionLike();
boolean isPositional = isPositionalParameter(partial) && property.isCollectionLike() ;
if(property.isMap() && currentPropertyRoot.equals(partial) && iterator.hasNext()){
partial = iterator.next();
currentIndex++;
}
if (isPositional || property.isMap()) {
if (isPositional || property.isMap() && !currentPropertyRoot.equals(partial)) {
mappedName.append(".").append(partial);
}
inspect = isPositional && iterator.hasNext();
}
if(currentIndex + 1 < pathParts.size()) {
currentIndex++;
currentPropertyRoot = pathParts.get(currentIndex);
}
return mappedName.toString();
}


@@ -16,7 +16,7 @@
package org.springframework.data.mongodb.core.geo;
/**
* Interface definition for structures defined in <a href="https://geojson.org/>GeoJSON</a> format.
* Interface definition for structures defined in <a href="https://geojson.org/">GeoJSON</a> format.
*
* @author Christoph Strobl
* @since 1.7


@@ -28,6 +28,7 @@ import org.springframework.util.ObjectUtils;
* {@link GeoJsonMultiPoint} is defined as list of {@link Point}s.
*
* @author Christoph Strobl
* @author Ivan Volzhev
* @since 1.7
* @see <a href="https://geojson.org/geojson-spec.html#multipoint">https://geojson.org/geojson-spec.html#multipoint</a>
*/
@@ -37,17 +38,31 @@ public class GeoJsonMultiPoint implements GeoJson<Iterable<Point>> {
private final List<Point> points;
/**
* Creates a new {@link GeoJsonMultiPoint} for the given {@link Point}.
*
* @param point must not be {@literal null}.
* @since 3.2.5
*/
public GeoJsonMultiPoint(Point point) {
Assert.notNull(point, "Point must not be null!");
this.points = new ArrayList<>();
this.points.add(point);
}
/**
* Creates a new {@link GeoJsonMultiPoint} for the given {@link Point}s.
*
* @param points points must not be {@literal null} and have at least 2 entries.
* @param points points must not be {@literal null} and not empty
*/
public GeoJsonMultiPoint(List<Point> points) {
Assert.notNull(points, "Points must not be null.");
Assert.isTrue(points.size() >= 2, "Minimum of 2 Points required.");
Assert.notNull(points, "Points must not be null!");
Assert.notEmpty(points, "Points must contain at least one point!");
this.points = new ArrayList<Point>(points);
this.points = new ArrayList<>(points);
}
/**
@@ -63,7 +78,7 @@ public class GeoJsonMultiPoint implements GeoJson<Iterable<Point>> {
Assert.notNull(second, "Second point must not be null!");
Assert.notNull(others, "Additional points must not be null!");
this.points = new ArrayList<Point>();
this.points = new ArrayList<>();
this.points.add(first);
this.points.add(second);
this.points.addAll(Arrays.asList(others));


@@ -15,8 +15,11 @@
*/
package org.springframework.data.mongodb.core.mapping;
import org.springframework.util.ObjectUtils;
/**
* @author Christoph Strobl
* @author Rogério Meneguelli Gatto
* @since 3.2
*/
class UnwrapEntityContext {
@@ -30,4 +33,32 @@ class UnwrapEntityContext {
public MongoPersistentProperty getProperty() {
return property;
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || getClass() != obj.getClass()) {
return false;
}
UnwrapEntityContext that = (UnwrapEntityContext) obj;
return ObjectUtils.nullSafeEquals(property, that.property);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(property);
}
}


@@ -24,11 +24,13 @@ import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.util.TypeInformation;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
/**
* Unwrapped variant of {@link MongoPersistentProperty}.
*
* @author Christoph Strobl
* @author Rogério Meneguelli Gatto
* @since 3.2
* @see Unwrapped
*/
@@ -304,4 +306,38 @@ class UnwrappedMongoPersistentProperty implements MongoPersistentProperty {
public <T> PersistentPropertyAccessor<T> getAccessorForOwner(T owner) {
return delegate.getAccessorForOwner(owner);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || getClass() != obj.getClass()) {
return false;
}
UnwrappedMongoPersistentProperty that = (UnwrappedMongoPersistentProperty) obj;
if (!ObjectUtils.nullSafeEquals(delegate, that.delegate)) {
return false;
}
return ObjectUtils.nullSafeEquals(context, that.context);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(delegate);
result = 31 * result + ObjectUtils.nullSafeHashCode(context);
return result;
}
}

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
@@ -192,7 +193,7 @@ public class Field {
*/
public Field slice(String field, int offset, int size) {
slices.put(field, new Integer[] { offset, size });
slices.put(field, Arrays.asList(offset, size));
return this;
}
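The `$slice` arguments are now rendered from a `List` instead of an `Integer[]`, producing the array form MongoDB expects for offset and limit. A usage sketch:

[source,java]
----
Query query = new Query(Criteria.where("id").is("id-1"));
query.fields().slice("values", 0, 1); // renders { "values" : { "$slice" : [ 0, 1 ] } }
----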

View File

@@ -49,8 +49,8 @@ public class Meta {
}
}
private final Map<String, Object> values = new LinkedHashMap<>(2);
private final Set<CursorOption> flags = new LinkedHashSet<>();
private Map<String, Object> values = Collections.emptyMap();
private Set<CursorOption> flags = Collections.emptySet();
private Integer cursorBatchSize;
private Boolean allowDiskUse;
@@ -63,8 +63,9 @@ public class Meta {
* @param source
*/
Meta(Meta source) {
this.values.putAll(source.values);
this.flags.addAll(source.flags);
this.values = new LinkedHashMap<>(source.values);
this.flags = new LinkedHashSet<>(source.flags);
this.cursorBatchSize = source.cursorBatchSize;
this.allowDiskUse = source.allowDiskUse;
}
@@ -158,6 +159,11 @@ public class Meta {
public boolean addFlag(CursorOption option) {
Assert.notNull(option, "CursorOption must not be null!");
if (this.flags == Collections.EMPTY_SET) {
this.flags = new LinkedHashSet<>(2);
}
return this.flags.add(option);
}
@@ -220,6 +226,10 @@ public class Meta {
Assert.hasText(key, "Meta key must not be 'null' or blank.");
if (values == Collections.EMPTY_MAP) {
values = new LinkedHashMap<>(2);
}
if (value == null || (value instanceof String && !StringUtils.hasText((String) value))) {
this.values.remove(key);
}
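The idiom applied here (and mirrored in `Query` and `Update` below) is copy-on-first-write: fields start out as the immutable JDK empty collections and are swapped for mutable ones only when a value is actually written. A condensed sketch of the pattern; `setValue` is an illustrative stand-in for `Meta`'s internal mutators:

[source,java]
----
private Map<String, Object> values = Collections.emptyMap(); // shared immutable singleton

void setValue(String key, Object value) {
    if (values == Collections.EMPTY_MAP) { // identity check: still the shared singleton?
        values = new LinkedHashMap<>(2);   // materialize a mutable map on first write
    }
    values.put(key, value);
}
----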

View File

@@ -21,6 +21,7 @@ import static org.springframework.util.ObjectUtils.*;
import java.time.Duration;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
@@ -30,6 +31,7 @@ import java.util.Set;
import java.util.concurrent.TimeUnit;
import org.bson.Document;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
@@ -52,7 +54,7 @@ public class Query {
private static final String RESTRICTED_TYPES_KEY = "_$RESTRICTED_TYPES";
private final Set<Class<?>> restrictedTypes = new HashSet<>();
private Set<Class<?>> restrictedTypes = Collections.emptySet();
private final Map<String, CriteriaDefinition> criteria = new LinkedHashMap<>();
private @Nullable Field fieldSpec = null;
private Sort sort = Sort.unsorted();
@@ -235,8 +237,15 @@ public class Query {
Assert.notNull(type, "Type must not be null!");
Assert.notNull(additionalTypes, "AdditionalTypes must not be null");
if (restrictedTypes == Collections.EMPTY_SET) {
restrictedTypes = new HashSet<>(1 + additionalTypes.length);
}
restrictedTypes.add(type);
restrictedTypes.addAll(Arrays.asList(additionalTypes));
if (additionalTypes.length > 0) {
restrictedTypes.addAll(Arrays.asList(additionalTypes));
}
return this;
}
@@ -246,6 +255,17 @@ public class Query {
*/
public Document getQueryObject() {
if (criteria.isEmpty() && restrictedTypes.isEmpty()) {
return BsonUtils.EMPTY_DOCUMENT;
}
if (criteria.size() == 1 && restrictedTypes.isEmpty()) {
for (CriteriaDefinition definition : criteria.values()) {
return definition.getCriteriaObject();
}
}
Document document = new Document();
for (CriteriaDefinition definition : criteria.values()) {
@@ -263,7 +283,7 @@ public class Query {
* @return the field {@link Document}.
*/
public Document getFieldsObject() {
return this.fieldSpec == null ? new Document() : fieldSpec.getFieldsObject();
return this.fieldSpec == null ? BsonUtils.EMPTY_DOCUMENT : fieldSpec.getFieldsObject();
}
/**
@@ -272,13 +292,12 @@ public class Query {
public Document getSortObject() {
if (this.sort.isUnsorted()) {
return new Document();
return BsonUtils.EMPTY_DOCUMENT;
}
Document document = new Document();
this.sort.stream()//
.forEach(order -> document.put(order.getProperty(), order.isAscending() ? 1 : -1));
this.sort.forEach(order -> document.put(order.getProperty(), order.isAscending() ? 1 : -1));
return document;
}
@@ -557,7 +576,7 @@ public class Query {
target.limit = source.getLimit();
target.hint = source.getHint();
target.collation = source.getCollation();
target.restrictedTypes.addAll(source.getRestrictedTypes());
target.restrictedTypes = new HashSet<>(source.getRestrictedTypes());
if (source.getMeta().hasValues()) {
target.setMeta(new Meta(source.getMeta()));
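An empty `Query` consequently hands out the shared immutable `BsonUtils.EMPTY_DOCUMENT` rather than allocating fresh `Document` instances, so callers must treat the returned objects as read-only. For example:

[source,java]
----
Query query = new Query();
Document queryDoc = query.getQueryObject(); // BsonUtils.EMPTY_DOCUMENT, not a fresh Document
Document sortDoc = query.getSortObject();   // the same shared immutable instance
// sortDoc.put("name", 1);                  // would throw UnsupportedOperationException
----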

View File

@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.core.query;
import java.util.Locale;
import org.bson.Document;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.lang.Nullable;
/**
@@ -157,7 +159,7 @@ public class TextQuery extends Query {
return super.getFieldsObject();
}
Document fields = super.getFieldsObject();
Document fields = BsonUtils.asMutableDocument(super.getFieldsObject());
fields.put(getScoreFieldName(), META_TEXT_SCORE);
return fields;
@@ -170,15 +172,14 @@ public class TextQuery extends Query {
@Override
public Document getSortObject() {
Document sort = new Document();
if (this.sortByScore) {
Document sort = new Document();
sort.put(getScoreFieldName(), META_TEXT_SCORE);
sort.putAll(super.getSortObject());
return sort;
}
sort.putAll(super.getSortObject());
return sort;
return super.getSortObject();
}
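Since `super.getFieldsObject()` may now return the immutable shared document, `TextQuery` copies it via `BsonUtils.asMutableDocument(…)` before adding the score field. A usage sketch:

[source,java]
----
TextQuery textQuery = TextQuery.queryText(TextCriteria.forDefaultLanguage().matching("coffee"))
    .includeScore();
Document fields = textQuery.getFieldsObject(); // score projection added to a mutable copy
----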
/*

View File

@@ -56,10 +56,10 @@ public class Update implements UpdateDefinition {
}
private boolean isolated = false;
private Set<String> keysToUpdate = new HashSet<>();
private Map<String, Object> modifierOps = new LinkedHashMap<>();
private Map<String, PushOperatorBuilder> pushCommandBuilders = new LinkedHashMap<>(1);
private List<ArrayFilter> arrayFilters = new ArrayList<>();
private final Set<String> keysToUpdate = new HashSet<>();
private final Map<String, Object> modifierOps = new LinkedHashMap<>();
private Map<String, PushOperatorBuilder> pushCommandBuilders = Collections.emptyMap();
private List<ArrayFilter> arrayFilters = Collections.emptyList();
/**
* Static factory method to create an Update using the provided key
@@ -193,6 +193,11 @@ public class Update implements UpdateDefinition {
public PushOperatorBuilder push(String key) {
if (!pushCommandBuilders.containsKey(key)) {
if (pushCommandBuilders == Collections.EMPTY_MAP) {
pushCommandBuilders = new LinkedHashMap<>(1);
}
pushCommandBuilders.put(key, new PushOperatorBuilder(key));
}
return pushCommandBuilders.get(key);
@@ -412,6 +417,10 @@ public class Update implements UpdateDefinition {
*/
public Update filterArray(CriteriaDefinition criteria) {
if (arrayFilters == Collections.EMPTY_LIST) {
this.arrayFilters = new ArrayList<>();
}
this.arrayFilters.add(criteria::getCriteriaObject);
return this;
}
@@ -427,6 +436,10 @@ public class Update implements UpdateDefinition {
*/
public Update filterArray(String identifier, Object expression) {
if (arrayFilters == Collections.EMPTY_LIST) {
this.arrayFilters = new ArrayList<>();
}
this.arrayFilters.add(() -> new Document(identifier, expression));
return this;
}
@@ -455,6 +468,15 @@ public class Update implements UpdateDefinition {
return Collections.unmodifiableList(this.arrayFilters);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#hasArrayFilters()
*/
@Override
public boolean hasArrayFilters() {
return !this.arrayFilters.isEmpty();
}
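`Update` applies the same lazy materialization to `arrayFilters` and `pushCommandBuilders`, and the new `hasArrayFilters()` makes the check cheap. A usage sketch:

[source,java]
----
Update update = new Update()
    .set("grades.$[element]", 100)
    .filterArray(Criteria.where("element").gte(85)); // filter list materialized on first use

boolean hasFilters = update.hasArrayFilters(); // true
----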
/**
* This method is no longer called; override {@link #addMultiFieldOperation(String, String, Object)} instead.
*

View File

@@ -39,7 +39,6 @@ import org.springframework.data.repository.core.RepositoryInformation;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.core.support.RepositoryComposition.RepositoryFragments;
import org.springframework.data.repository.core.support.RepositoryFactorySupport;
import org.springframework.data.repository.core.support.RepositoryFragment;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
@@ -92,8 +91,21 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
*/
@Override
protected RepositoryFragments getRepositoryFragments(RepositoryMetadata metadata) {
return getRepositoryFragments(metadata, operations);
}
RepositoryFragments fragments = RepositoryFragments.empty();
/**
* Creates {@link RepositoryFragments} based on {@link RepositoryMetadata} to add Mongo-specific extensions. Typically
* adds a {@link QuerydslMongoPredicateExecutor} if the repository interface uses Querydsl.
* <p>
* Can be overridden by subclasses to customize {@link RepositoryFragments}.
*
* @param metadata repository metadata.
* @param operations the MongoDB operations manager.
* @return
* @since 3.2.1
*/
protected RepositoryFragments getRepositoryFragments(RepositoryMetadata metadata, MongoOperations operations) {
boolean isQueryDslRepository = QUERY_DSL_PRESENT
&& QuerydslPredicateExecutor.class.isAssignableFrom(metadata.getRepositoryInterface());
@@ -105,14 +117,11 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
"Cannot combine Querydsl and reactive repository support in a single interface");
}
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(metadata.getDomainType(),
metadata);
fragments = fragments.append(RepositoryFragment.implemented(
getTargetRepositoryViaReflection(QuerydslMongoPredicateExecutor.class, entityInformation, operations)));
return RepositoryFragments
.just(new QuerydslMongoPredicateExecutor<>(getEntityInformation(metadata.getDomainType()), operations));
}
return fragments;
return RepositoryFragments.empty();
}
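The new overload makes the fragment contribution overridable. A sketch of a hypothetical subclass (the class name is illustrative):

[source,java]
----
class CustomMongoRepositoryFactory extends MongoRepositoryFactory {

    CustomMongoRepositoryFactory(MongoOperations mongoOperations) {
        super(mongoOperations);
    }

    @Override
    protected RepositoryFragments getRepositoryFragments(RepositoryMetadata metadata, MongoOperations operations) {
        // append custom fragments to the Mongo-specific defaults here
        return super.getRepositoryFragments(metadata, operations);
    }
}
----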
/*

View File

@@ -70,7 +70,7 @@ public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID exten
* @see org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport#setMappingContext(org.springframework.data.mapping.context.MappingContext)
*/
@Override
protected void setMappingContext(MappingContext<?, ?> mappingContext) {
public void setMappingContext(MappingContext<?, ?> mappingContext) {
super.setMappingContext(mappingContext);
this.mappingContextConfigured = true;

View File

@@ -212,6 +212,10 @@ public class QuerydslMongoPredicateExecutor<T> extends QuerydslPredicateExecutor
*/
private SpringDataMongodbQuery<T> applyPagination(SpringDataMongodbQuery<T> query, Pageable pageable) {
if (pageable.isUnpaged()) {
return query;
}
query = query.offset(pageable.getOffset()).limit(pageable.getPageSize());
return applySorting(query, pageable.getSort());
}
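With the guard in place, unpaged requests skip offset and limit entirely. For example, assuming a Querydsl-generated `QPerson person` and a `QuerydslPredicateExecutor<Person> executor` (see also the `findPage` test further down):

[source,java]
----
Page<Person> all = executor.findAll(person.lastname.startsWith("M"), Pageable.unpaged()); // no offset/limit applied
----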

View File

@@ -80,7 +80,7 @@ public class ReactiveMongoRepositoryFactoryBean<T extends Repository<S, ID>, S,
* @see org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport#setMappingContext(org.springframework.data.mapping.context.MappingContext)
*/
@Override
protected void setMappingContext(MappingContext<?, ?> mappingContext) {
public void setMappingContext(MappingContext<?, ?> mappingContext) {
super.setMappingContext(mappingContext);
this.mappingContextConfigured = true;

View File

@@ -60,12 +60,26 @@ import com.mongodb.MongoClientSettings;
*/
public class BsonUtils {
/**
* The empty document (immutable). This document is serializable.
*
* @since 3.2.5
*/
public static final Document EMPTY_DOCUMENT = new EmptyDocument();
@SuppressWarnings("unchecked")
@Nullable
public static <T> T get(Bson bson, String key) {
return (T) asMap(bson).get(key);
}
/**
* Return the {@link Bson} object as {@link Map}. Depending on the input type, the return value can be either a cast
* version of {@code bson} or a converted one (detached from the original value).
*
* @param bson
* @return
*/
public static Map<String, Object> asMap(Bson bson) {
if (bson instanceof Document) {
@@ -81,6 +95,55 @@ public class BsonUtils {
return (Map) bson.toBsonDocument(Document.class, MongoClientSettings.getDefaultCodecRegistry());
}
/**
* Return the {@link Bson} object as {@link Document}. Depending on the input type, the return value can be either a
* cast version of {@code bson} or a converted one (detached from the original value).
*
* @param bson
* @return
* @since 3.2.5
*/
public static Document asDocument(Bson bson) {
if (bson instanceof Document) {
return (Document) bson;
}
Map<String, Object> map = asMap(bson);
if (map instanceof Document) {
return (Document) map;
}
return new Document(map);
}
/**
* Return the {@link Bson} object as mutable {@link Document} containing all entries from {@link Bson}.
*
* @param bson
* @return a mutable {@link Document} containing all entries from {@link Bson}.
* @since 3.2.5
*/
public static Document asMutableDocument(Bson bson) {
if (bson instanceof EmptyDocument) {
bson = new Document(asDocument(bson));
}
if (bson instanceof Document) {
return (Document) bson;
}
Map<String, Object> map = asMap(bson);
if (map instanceof Document) {
return (Document) map;
}
return new Document(map);
}
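`asMutableDocument` guarantees a writable instance even when handed the shared `EMPTY_DOCUMENT`. For example:

[source,java]
----
Document mutable = BsonUtils.asMutableDocument(BsonUtils.EMPTY_DOCUMENT);
mutable.put("status", "ok"); // safe: the immutable singleton was copied first
----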
public static void addToMap(Bson bson, String key, @Nullable Object value) {
if (bson instanceof Document) {
@@ -488,6 +551,49 @@ public class BsonUtils {
return null;
}
/**
* Returns the given source object as {@link Bson}, i.e. {@link Document}s and maps as-is, or throws an
* {@link IllegalArgumentException} otherwise.
*
* @param source
* @return the converted/casted source object.
* @throws IllegalArgumentException if {@code source} cannot be converted/cast to {@link Bson}.
* @since 3.2.3
* @see #supportsBson(Object)
*/
@SuppressWarnings("unchecked")
public static Bson asBson(Object source) {
if (source instanceof Document) {
return (Document) source;
}
if (source instanceof BasicDBObject) {
return (BasicDBObject) source;
}
if (source instanceof DBObject) {
return new Document(((DBObject) source).toMap());
}
if (source instanceof Map) {
return new Document((Map<String, Object>) source);
}
throw new IllegalArgumentException(String.format("Cannot convert %s to Bson", source));
}
/**
* Returns whether the given source can be used as or converted to {@link Bson}.
*
* @param source
* @return {@literal true} if the given source can be converted to {@link Bson}.
* @since 3.2.3
*/
public static boolean supportsBson(Object source) {
return source instanceof DBObject || source instanceof Map;
}
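`supportsBson` is the guard to call before `asBson`. A usage sketch:

[source,java]
----
Object source = Collections.singletonMap("name", "luke");
if (BsonUtils.supportsBson(source)) {
    Bson bson = BsonUtils.asBson(source); // Maps and DBObjects convert; anything else throws
}
----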
/**
* Returns given object as {@link Collection}. Will return the {@link Collection} as is if the source is a
* {@link Collection} already, will convert an array into a {@link Collection} or simply create a single element

View File

@@ -0,0 +1,95 @@
/*
* Copyright 2021 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.util;
import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.Set;
import java.util.function.BiFunction;
import org.bson.Document;
import org.jetbrains.annotations.Nullable;
/**
* Empty variant of {@link Document}.
*
* @author Mark Paluch
*/
class EmptyDocument extends Document {
@Override
public Document append(String key, Object value) {
throw new UnsupportedOperationException();
}
@Override
public Object put(String key, Object value) {
throw new UnsupportedOperationException();
}
@Override
public Object remove(Object key) {
throw new UnsupportedOperationException();
}
@Override
public void putAll(Map<? extends String, ?> map) {
throw new UnsupportedOperationException();
}
@Override
public void replaceAll(BiFunction<? super String, ? super Object, ?> function) {
throw new UnsupportedOperationException();
}
@Override
public boolean remove(Object key, Object value) {
throw new UnsupportedOperationException();
}
@Override
public boolean replace(String key, Object oldValue, Object newValue) {
throw new UnsupportedOperationException();
}
@Nullable
@Override
public Object replace(String key, Object value) {
throw new UnsupportedOperationException();
}
@Override
public Set<Entry<String, Object>> entrySet() {
return Collections.emptySet();
}
@Override
public Collection<Object> values() {
return Collections.emptyList();
}
@Override
public Set<String> keySet() {
return Collections.emptySet();
}
@Override
public void clear() {
throw new UnsupportedOperationException();
}
}
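Reads behave like a regular empty `Document`, while every mutator is rejected. For example:

[source,java]
----
Document empty = BsonUtils.EMPTY_DOCUMENT;
empty.get("missing");         // null; reads are permitted
empty.isEmpty();              // true
// empty.put("key", "value"); // throws UnsupportedOperationException
----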

View File

@@ -364,6 +364,28 @@ infix fun KProperty<GeoJson<*>>.maxDistance(d: Double): Criteria =
infix fun KProperty<GeoJson<*>>.minDistance(d: Double): Criteria =
Criteria(asString(this)).minDistance(d)
/**
* Creates a geo-spatial criterion using a $maxDistance operation, for use with $near
*
* See [MongoDB Query operator:
* $maxDistance](https://docs.mongodb.com/manual/reference/operator/query/maxDistance/)
* @author Sangyong Choi
* @since 3.2.5
* @see Criteria.maxDistance
*/
infix fun Criteria.maxDistance(d: Double): Criteria =
this.maxDistance(d)
/**
* Creates a geospatial criterion using a $minDistance operation, for use with $near or
* $nearSphere.
* @author Sangyong Choi
* @since 3.2.5
* @see Criteria.minDistance
*/
infix fun Criteria.minDistance(d: Double): Criteria =
this.minDistance(d)
/**
* Creates a criterion using the $elemMatch operator
*

View File

@@ -109,6 +109,30 @@ class MongoDatabaseUtilsUnitTests {
verify(dbFactory, never()).withSession(any(ClientSession.class));
}
@Test // GH-3760
void shouldJustReturnDatabaseIfSessionSynchronizationDisabled() throws Exception {
when(dbFactory.getMongoDatabase()).thenReturn(db);
JtaTransactionManager txManager = new JtaTransactionManager(userTransaction);
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus transactionStatus) {
MongoDatabaseUtils.getDatabase(dbFactory, SessionSynchronization.NEVER);
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isFalse();
}
});
verify(userTransaction).getStatus();
verifyNoMoreInteractions(userTransaction);
verifyNoInteractions(session);
}
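Applications that never use MongoDB transactions can opt out of the synchronization checks entirely. A configuration sketch, assuming existing `databaseFactory` and `converter` beans:

[source,java]
----
MongoTemplate template = new MongoTemplate(databaseFactory, converter);
template.setSessionSynchronization(SessionSynchronization.NEVER); // bypasses transaction inspection per operation
----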
@Test // DATAMONGO-1920
void shouldParticipateInOngoingJtaTransactionWithCommitWhenSessionSychronizationIsAny() throws Exception {

View File

@@ -88,6 +88,20 @@ class ReactiveMongoDatabaseUtilsUnitTests {
}).as(StepVerifier::create).expectNext(true).verifyComplete();
}
@Test // GH-3760
void shouldJustReturnDatabaseIfSessionSynchronizationDisabled() {
when(databaseFactory.getMongoDatabase()).thenReturn(Mono.just(db));
ReactiveMongoDatabaseUtils.getDatabase(databaseFactory, SessionSynchronization.NEVER) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
verify(databaseFactory, never()).getSession(any());
verify(databaseFactory, never()).withSession(any(ClientSession.class));
}
@Test // DATAMONGO-2265
void shouldNotStartSessionWhenNoTransactionOngoing() {

View File

@@ -3703,6 +3703,23 @@ public class MongoTemplateTests {
assertThat(template.find(new BasicQuery("{}").with(Sort.by("id")), WithIdAndFieldAnnotation.class)).isNotEmpty();
}
@Test // GH-3811
public void sliceShouldLimitCollectionValues() {
DocumentWithCollectionOfSimpleType source = new DocumentWithCollectionOfSimpleType();
source.id = "id-1";
source.values = Arrays.asList("spring", "data", "mongodb");
template.save(source);
Criteria criteria = Criteria.where("id").is(source.id);
Query query = Query.query(criteria);
query.fields().slice("values", 0, 1);
DocumentWithCollectionOfSimpleType target = template.findOne(query, DocumentWithCollectionOfSimpleType.class);
assertThat(target.values).containsExactly("spring");
}
private AtomicReference<ImmutableVersioned> createAfterSaveReference() {
AtomicReference<ImmutableVersioned> saved = new AtomicReference<>();

View File

@@ -95,6 +95,7 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.lang.Nullable;
import org.springframework.test.util.ReflectionTestUtils;
import org.springframework.util.CollectionUtils;
@@ -1050,7 +1051,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonSpELProjection.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document()));
verify(findIterable).projection(eq(BsonUtils.EMPTY_DOCUMENT));
}
@Test // DATAMONGO-1733, DATAMONGO-2041
@@ -1077,7 +1078,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
template.doFind("star-wars", new Document(), new Document(), Person.class, Person.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document()));
verify(findIterable).projection(eq(BsonUtils.EMPTY_DOCUMENT));
}
@Test // DATAMONGO-1733
@@ -1086,7 +1087,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
template.doFind("star-wars", new Document(), new Document(), Person.class, PersonExtended.class,
CursorPreparer.NO_OP_PREPARER);
verify(findIterable).projection(eq(new Document()));
verify(findIterable).projection(eq(BsonUtils.EMPTY_DOCUMENT));
}
@Test // DATAMONGO-1348, DATAMONGO-2264

View File

@@ -30,7 +30,7 @@ import java.time.LocalDateTime;
import java.time.temporal.ChronoUnit;
import java.util.*;
import org.assertj.core.api.Assertions;
import org.bson.types.Binary;
import org.bson.types.Code;
import org.bson.types.Decimal128;
import org.bson.types.ObjectId;
@@ -42,6 +42,7 @@ import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.beans.ConversionNotSupportedException;
import org.springframework.beans.factory.annotation.Value;
@@ -81,6 +82,8 @@ import org.springframework.data.mongodb.core.mapping.TextScore;
import org.springframework.data.mongodb.core.mapping.Unwrapped;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertCallback;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.lang.NonNull;
import org.springframework.lang.Nullable;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.BasicDBList;
@@ -527,7 +530,7 @@ class MappingMongoConverterUnitTests {
}
@Test
public void convertsObjectsIfNecessary() {
void convertsObjectsIfNecessary() {
ObjectId id = new ObjectId();
assertThat(converter.convertToMongoType(id)).isEqualTo(id);
@@ -931,10 +934,11 @@ class MappingMongoConverterUnitTests {
assertThat(readResult.iterator().next()).isInstanceOf(Address.class);
}
@Test // DATAMONGO-402
@Test // DATAMONGO-402, GH-3702
void readsMemberClassCorrectly() {
org.bson.Document document = new org.bson.Document("inner", new org.bson.Document("value", "FOO!"));
org.bson.Document document = new org.bson.Document("inner",
new LinkedHashMap<>(new org.bson.Document("value", "FOO!")));
Outer outer = converter.read(Outer.class, document);
assertThat(outer.inner).isNotNull();
@@ -2111,21 +2115,21 @@ class MappingMongoConverterUnitTests {
}
@Test // DATAMONGO-2479
public void entityCallbacksAreNotSetByDefault() {
Assertions.assertThat(ReflectionTestUtils.getField(converter, "entityCallbacks")).isNull();
void entityCallbacksAreNotSetByDefault() {
assertThat(ReflectionTestUtils.getField(converter, "entityCallbacks")).isNull();
}
@Test // DATAMONGO-2479
public void entityCallbacksShouldBeInitiatedOnSettingApplicationContext() {
void entityCallbacksShouldBeInitiatedOnSettingApplicationContext() {
ApplicationContext ctx = new StaticApplicationContext();
converter.setApplicationContext(ctx);
Assertions.assertThat(ReflectionTestUtils.getField(converter, "entityCallbacks")).isNotNull();
assertThat(ReflectionTestUtils.getField(converter, "entityCallbacks")).isNotNull();
}
@Test // DATAMONGO-2479
public void setterForEntityCallbackOverridesContextInitializedOnes() {
void setterForEntityCallbackOverridesContextInitializedOnes() {
ApplicationContext ctx = new StaticApplicationContext();
converter.setApplicationContext(ctx);
@@ -2133,11 +2137,11 @@ class MappingMongoConverterUnitTests {
EntityCallbacks callbacks = EntityCallbacks.create();
converter.setEntityCallbacks(callbacks);
Assertions.assertThat(ReflectionTestUtils.getField(converter, "entityCallbacks")).isSameAs(callbacks);
assertThat(ReflectionTestUtils.getField(converter, "entityCallbacks")).isSameAs(callbacks);
}
@Test // DATAMONGO-2479
public void setterForApplicationContextShouldNotOverrideAlreadySetEntityCallbacks() {
void setterForApplicationContextShouldNotOverrideAlreadySetEntityCallbacks() {
EntityCallbacks callbacks = EntityCallbacks.create();
ApplicationContext ctx = new StaticApplicationContext();
@@ -2145,11 +2149,11 @@ class MappingMongoConverterUnitTests {
converter.setEntityCallbacks(callbacks);
converter.setApplicationContext(ctx);
Assertions.assertThat(ReflectionTestUtils.getField(converter, "entityCallbacks")).isSameAs(callbacks);
assertThat(ReflectionTestUtils.getField(converter, "entityCallbacks")).isSameAs(callbacks);
}
@Test // DATAMONGO-2479
public void resolveDBRefMapValueShouldInvokeCallbacks() {
void resolveDBRefMapValueShouldInvokeCallbacks() {
AfterConvertCallback<Person> afterConvertCallback = spy(new ReturningAfterConvertCallback());
converter.setEntityCallbacks(EntityCallbacks.create(afterConvertCallback));
@@ -2166,7 +2170,7 @@ class MappingMongoConverterUnitTests {
}
@Test // DATAMONGO-2300
public void readAndConvertDBRefNestedByMapCorrectly() {
void readAndConvertDBRefNestedByMapCorrectly() {
org.bson.Document cluster = new org.bson.Document("_id", 100L);
DBRef dbRef = new DBRef("clusters", 100L);
@@ -2428,6 +2432,178 @@ class MappingMongoConverterUnitTests {
verify(subTypeOfGenericTypeConverter).convert(eq(source));
}
@Test // GH-3660
void usesCustomConverterForMapTypesOnWrite() {
converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(it -> {
it.registerConverter(new TypeImplementingMapToDocumentConverter());
}));
converter.afterPropertiesSet();
TypeImplementingMap source = new TypeImplementingMap("one", 2);
org.bson.Document target = new org.bson.Document();
converter.write(source, target);
assertThat(target).containsEntry("1st", "one").containsEntry("2nd", 2);
}
@Test // GH-3660
void usesCustomConverterForTypesImplementingMapOnWrite() {
converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(it -> {
it.registerConverter(new TypeImplementingMapToDocumentConverter());
}));
converter.afterPropertiesSet();
TypeImplementingMap source = new TypeImplementingMap("one", 2);
org.bson.Document target = new org.bson.Document();
converter.write(source, target);
assertThat(target).containsEntry("1st", "one").containsEntry("2nd", 2);
}
@Test // GH-3660
void usesCustomConverterForTypesImplementingMapOnRead() {
converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(it -> {
it.registerConverter(new DocumentToTypeImplementingMapConverter());
}));
converter.afterPropertiesSet();
org.bson.Document source = new org.bson.Document("1st", "one")
.append("2nd", 2)
.append("_class", TypeImplementingMap.class.getName());
TypeImplementingMap target = converter.read(TypeImplementingMap.class, source);
assertThat(target).isEqualTo(new TypeImplementingMap("one", 2));
}
@Test // GH-3660
void usesCustomConverterForPropertiesUsingTypesThatImplementMapOnWrite() {
converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(it -> {
it.registerConverter(new TypeImplementingMapToDocumentConverter());
}));
converter.afterPropertiesSet();
TypeWrappingTypeImplementingMap source = new TypeWrappingTypeImplementingMap();
source.typeImplementingMap = new TypeImplementingMap("one", 2);
org.bson.Document target = new org.bson.Document();
converter.write(source, target);
assertThat(target).containsEntry("typeImplementingMap", new org.bson.Document("1st", "one").append("2nd", 2));
}
@Test // GH-3660
void usesCustomConverterForPropertiesUsingTypesImplementingMapOnRead() {
converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(it -> {
it.registerConverter(new DocumentToTypeImplementingMapConverter());
}));
converter.afterPropertiesSet();
org.bson.Document source = new org.bson.Document("typeImplementingMap",
new org.bson.Document("1st", "one")
.append("2nd", 2))
.append("_class", TypeWrappingTypeImplementingMap.class.getName());
TypeWrappingTypeImplementingMap target = converter.read(TypeWrappingTypeImplementingMap.class, source);
assertThat(target.typeImplementingMap).isEqualTo(new TypeImplementingMap("one", 2));
}
@Test // GH-3686
void readsCollectionContainingNullValue() {
org.bson.Document source = new org.bson.Document("items", Arrays.asList(new org.bson.Document("itemKey", "i1"), null, new org.bson.Document("itemKey", "i3")));
Order target = converter.read(Order.class, source);
assertThat(target.items)
.map(it -> it != null ? it.itemKey : null)
.containsExactly("i1", null, "i3");
}
@Test // GH-3686
void readsArrayContainingNullValue() {
org.bson.Document source = new org.bson.Document("arrayOfStrings", Arrays.asList("i1", null, "i3"));
WithArrays target = converter.read(WithArrays.class, source);
assertThat(target.arrayOfStrings).containsExactly("i1", null, "i3");
}
@Test // GH-3686
void readsMapContainingNullValue() {
org.bson.Document source = new org.bson.Document("mapOfObjects", new org.bson.Document("item1", "i1").append("item2", null).append("item3", "i3"));
ClassWithMapProperty target = converter.read(ClassWithMapProperty.class, source);
assertThat(target.mapOfObjects)
.containsEntry("item1", "i1")
.containsEntry("item2", null)
.containsEntry("item3", "i3");
}
@Test // GH-3670
void appliesCustomConverterEvenToSimpleTypes() {
converter = new MappingMongoConverter(resolver, mappingContext);
converter.setCustomConversions(MongoCustomConversions.create(it -> {
it.registerConverter(new MongoSimpleTypeConverter());
}));
converter.afterPropertiesSet();
org.bson.Document source = new org.bson.Document("content", new Binary(new byte[] {0x00, 0x42}));
GenericType<Object> target = converter.read(GenericType.class, source);
assertThat(target.content).isInstanceOf(byte[].class);
}
@Test // GH-3702
void readsRawDocument() {
org.bson.Document source = new org.bson.Document("_id", "id-1").append("raw", new org.bson.Document("simple", 1).append("document", new org.bson.Document("inner-doc", 1)));
WithRawDocumentProperties target = converter.read(WithRawDocumentProperties.class, source);
assertThat(target.raw).isInstanceOf(org.bson.Document.class).isEqualTo( new org.bson.Document("simple", 1).append("document", new org.bson.Document("inner-doc", 1)));
}
@Test // GH-3702
void readsListOfRawDocument() {
org.bson.Document source = new org.bson.Document("_id", "id-1").append("listOfRaw", Arrays.asList(new org.bson.Document("simple", 1).append("document", new org.bson.Document("inner-doc", 1))));
WithRawDocumentProperties target = converter.read(WithRawDocumentProperties.class, source);
assertThat(target.listOfRaw)
.containsExactly(new org.bson.Document("simple", 1).append("document", new org.bson.Document("inner-doc", 1)));
}
@Test // GH-3692
void readsMapThatDoesNotComeAsDocument() {
org.bson.Document source = new org.bson.Document("_id", "id-1").append("mapOfObjects", Collections.singletonMap("simple", 1));
ClassWithMapProperty target = converter.read(ClassWithMapProperty.class, source);
assertThat(target.mapOfObjects).containsEntry("simple",1);
}
static class GenericType<T> {
T content;
}
@@ -2789,6 +2965,10 @@ class MappingMongoConverterUnitTests {
}
static class WithArrays {
String[] arrayOfStrings;
}
// DATAMONGO-1898
// DATACMNS-1278
@@ -2971,4 +3151,122 @@ class MappingMongoConverterUnitTests {
return target;
}
}
@WritingConverter
static class TypeImplementingMapToDocumentConverter implements Converter<TypeImplementingMap, org.bson.Document> {
@Nullable
@Override
public org.bson.Document convert(TypeImplementingMap source) {
return new org.bson.Document("1st", source.val1).append("2nd", source.val2);
}
}
@ReadingConverter
static class DocumentToTypeImplementingMapConverter implements Converter<org.bson.Document, TypeImplementingMap> {
@Nullable
@Override
public TypeImplementingMap convert(org.bson.Document source) {
return new TypeImplementingMap(source.getString("1st"), source.getInteger("2nd"));
}
}
@ReadingConverter
public static class MongoSimpleTypeConverter implements Converter<Binary, Object> {
@Override
public byte[] convert(Binary source) {
return source.getData();
}
}
static class TypeWrappingTypeImplementingMap {
String id;
TypeImplementingMap typeImplementingMap;
}
@EqualsAndHashCode
static class TypeImplementingMap implements Map<String,String> {
String val1;
int val2;
TypeImplementingMap(String val1, int val2) {
this.val1 = val1;
this.val2 = val2;
}
@Override
public int size() {
return 0;
}
@Override
public boolean isEmpty() {
return false;
}
@Override
public boolean containsKey(Object key) {
return false;
}
@Override
public boolean containsValue(Object value) {
return false;
}
@Override
public String get(Object key) {
return null;
}
@Nullable
@Override
public String put(String key, String value) {
return null;
}
@Override
public String remove(Object key) {
return null;
}
@Override
public void putAll(@NonNull Map<? extends String, ? extends String> m) {
}
@Override
public void clear() {
}
@NonNull
@Override
public Set<String> keySet() {
return null;
}
@NonNull
@Override
public Collection<String> values() {
return null;
}
@NonNull
@Override
public Set<Entry<String, String>> entrySet() {
return null;
}
}
static class WithRawDocumentProperties {
String id;
org.bson.Document raw;
List<org.bson.Document> listOfRaw;
}
}

View File

@@ -33,11 +33,11 @@ import org.bson.types.Code;
import org.bson.types.ObjectId;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Transient;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.geo.Point;
@@ -50,6 +50,7 @@ import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.FieldType;
import org.springframework.data.mongodb.core.mapping.MongoId;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.TextScore;
@@ -58,6 +59,7 @@ import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.TextQuery;
import org.springframework.data.mongodb.core.query.Update;
import com.mongodb.BasicDBObject;
import com.mongodb.MongoClientSettings;
@@ -71,22 +73,23 @@ import com.mongodb.client.model.Filters;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author David Julia
*/
@ExtendWith(MockitoExtension.class)
public class QueryMapperUnitTests {
private QueryMapper mapper;
private MongoMappingContext context;
private MappingMongoConverter converter;
@Mock MongoDatabaseFactory factory;
@BeforeEach
void beforeEach() {
MongoCustomConversions conversions = new MongoCustomConversions();
this.context = new MongoMappingContext();
this.context.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
this.converter = new MappingMongoConverter(new DefaultDbRefResolver(factory), context);
this.converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, context);
this.converter.setCustomConversions(conversions);
this.converter.afterPropertiesSet();
this.mapper = new QueryMapper(converter);
@@ -732,6 +735,28 @@ public class QueryMapperUnitTests {
assertThat(document).containsKey("map.1.stringProperty");
}
@Test // GH-3688
void mappingShouldRetainNestedNumericMapKeys() {
Query query = query(where("outerMap.1.map.2.stringProperty").is("ba'alzamon"));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(EntityWithIntKeyedMapOfMap.class));
assertThat(document).containsKey("outerMap.1.map.2.stringProperty");
}
@Test // GH-3688
void mappingShouldAllowSettingEntireNestedNumericKeyedMapValue() {
Query query = query(where("outerMap.1.map").is(null)); //newEntityWithComplexValueTypeMap()
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(EntityWithIntKeyedMapOfMap.class));
assertThat(document).containsKey("outerMap.1.map");
}
@Test // DATAMONGO-1269
void mappingShouldRetainNumericPositionInList() {
@@ -1257,6 +1282,71 @@ public class QueryMapperUnitTests {
assertThat(document).isEqualTo(new org.bson.Document("double_underscore.renamed", new org.bson.Document("$exists", true)));
}
@Test // GH-3633
void mapsNullValueForFieldWithCustomTargetType() {
Query query = query(where("stringAsOid").is(null));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(NonIdFieldWithObjectIdTargetType.class));
assertThat(document).isEqualTo(new org.bson.Document("stringAsOid", null));
}
@Test // GH-3635
void $floorKeywordDoesNotMatch$or$norPattern() {
Query query = new BasicQuery(" { $expr: { $gt: [ \"$spent\" , { $floor : \"$budget\" } ] } }");
assertThatNoException()
.isThrownBy(() -> mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(Foo.class)));
}
@Test // GH-3659
void allowsUsingFieldPathsForPropertiesHavingCustomConversionRegistered() {
Query query = query(where("address.street").is("1007 Mountain Drive"));
MongoCustomConversions mongoCustomConversions = new MongoCustomConversions(Collections.singletonList(new MyAddressToDocumentConverter()));
this.context = new MongoMappingContext();
this.context.setSimpleTypeHolder(mongoCustomConversions.getSimpleTypeHolder());
this.context.afterPropertiesSet();
this.converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, context);
this.converter.setCustomConversions(mongoCustomConversions);
this.converter.afterPropertiesSet();
this.mapper = new QueryMapper(converter);
assertThat(mapper.getMappedSort(query.getQueryObject(), context.getPersistentEntity(Customer.class))).isEqualTo(new org.bson.Document("address.street", "1007 Mountain Drive"));
}
@Test // GH-3668
void mapStringIdFieldProjection() {
org.bson.Document mappedFields = mapper.getMappedFields(new org.bson.Document("id", 1), context.getPersistentEntity(WithStringId.class));
assertThat(mappedFields).containsEntry("_id", 1);
}
@Test // GH-3783
void retainsId$InWithStringArray() {
org.bson.Document mappedQuery = mapper.getMappedObject(
org.bson.Document.parse("{ _id : { $in: [\"5b8bedceb1e0bfc07b008828\"]}}"),
context.getPersistentEntity(WithExplicitStringId.class));
assertThat(mappedQuery.get("_id")).isEqualTo(org.bson.Document.parse("{ $in: [\"5b8bedceb1e0bfc07b008828\"]}"));
}
@Test // GH-3783
void mapsId$InInToObjectIds() {
org.bson.Document mappedQuery = mapper.getMappedObject(
org.bson.Document.parse("{ _id : { $in: [\"5b8bedceb1e0bfc07b008828\"]}}"),
context.getPersistentEntity(ClassWithDefaultId.class));
assertThat(mappedQuery.get("_id"))
.isEqualTo(org.bson.Document.parse("{ $in: [ {$oid: \"5b8bedceb1e0bfc07b008828\" } ]}"));
}
class WithDeepArrayNesting {
List<WithNestedArray> level0;
@@ -1320,6 +1410,18 @@ public class QueryMapperUnitTests {
@Id private String foo;
}
class WithStringId {
@MongoId String id;
String name;
}
class WithExplicitStringId {
@MongoId(FieldType.STRING) String id;
String name;
}
class BigIntegerId {
@Id private BigInteger id;
@@ -1396,18 +1498,22 @@ public class QueryMapperUnitTests {
@Field("geoJsonPointWithNameViaFieldAnnotation") GeoJsonPoint namedGeoJsonPoint;
}
static class SimpeEntityWithoutId {
static class SimpleEntityWithoutId {
String stringProperty;
Integer integerProperty;
}
static class EntityWithComplexValueTypeMap {
Map<Integer, SimpeEntityWithoutId> map;
Map<Integer, SimpleEntityWithoutId> map;
}
static class EntityWithIntKeyedMapOfMap {
Map<Integer, EntityWithComplexValueTypeMap> outerMap;
}
static class EntityWithComplexValueTypeList {
List<SimpeEntityWithoutId> list;
List<SimpleEntityWithoutId> list;
}
static class WithExplicitTargetTypes {
@@ -1487,4 +1593,28 @@ public class QueryMapperUnitTests {
@Field("renamed")
String renamed_fieldname_with_underscores;
}
@Document
static class Customer {
@Id
private ObjectId id;
private String name;
private MyAddress address;
}
static class MyAddress {
private String street;
}
@WritingConverter
public static class MyAddressToDocumentConverter implements Converter<MyAddress, org.bson.Document> {
@Override
public org.bson.Document convert(MyAddress address) {
org.bson.Document doc = new org.bson.Document();
doc.put("street", address.street);
return doc;
}
}
}

View File

@@ -66,6 +66,8 @@ import com.mongodb.DBRef;
* @author Thomas Darimont
* @author Mark Paluch
* @author Pavel Vodrazka
* @author David Julia
* @author Divya Srivastava
*/
@ExtendWith(MockitoExtension.class)
class UpdateMapperUnitTests {
@@ -1179,6 +1181,16 @@ class UpdateMapperUnitTests {
assertThat(mappedUpdate).isEqualTo("{\"$set\": {\"map.601218778970110001827396.value\": \"testing\"}}");
}
@Test // GH-3688
void multipleNumericKeysInNestedPath() {
Update update = new Update().set("intKeyedMap.12345.map.0", "testing");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithIntKeyedMap.class));
assertThat(mappedUpdate).isEqualTo("{\"$set\": {\"intKeyedMap.12345.map.0\": \"testing\"}}");
}
@Test // GH-3566
void mapsObjectClassPropertyFieldInMapValueTypeAsKey() {
@@ -1189,6 +1201,56 @@ class UpdateMapperUnitTests {
assertThat(mappedUpdate).isEqualTo("{\"$set\": {\"map.class\": \"value\"}}");
}
@Test // GH-3775
void mapNestedStringFieldCorrectly() {
Update update = new Update().set("levelOne.a.b.d", "e");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithNestedMap.class));
assertThat(mappedUpdate).isEqualTo(new org.bson.Document("$set",new org.bson.Document("levelOne.a.b.d","e")));
}
@Test // GH-3775
void mapNestedIntegerFieldCorrectly() {
Update update = new Update().set("levelOne.0.1.3", "4");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithNestedMap.class));
assertThat(mappedUpdate).isEqualTo(new org.bson.Document("$set",new org.bson.Document("levelOne.0.1.3","4")));
}
@Test // GH-3775
void mapNestedMixedStringIntegerFieldCorrectly() {
Update update = new Update().set("levelOne.0.1.c", "4");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithNestedMap.class));
assertThat(mappedUpdate).isEqualTo(new org.bson.Document("$set",new org.bson.Document("levelOne.0.1.c","4")));
}
@Test // GH-3775
void mapNestedMixedStringIntegerWithStartNumberFieldCorrectly() {
Update update = new Update().set("levelOne.0a.1b.3c", "4");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithNestedMap.class));
assertThat(mappedUpdate).isEqualTo(new org.bson.Document("$set",new org.bson.Document("levelOne.0a.1b.3c","4")));
}
@Test // GH-3688
void multipleKeysStartingWithANumberInNestedPath() {
Update update = new Update().set("intKeyedMap.1a.map.0b", "testing");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithIntKeyedMap.class));
assertThat(mappedUpdate).isEqualTo("{\"$set\": {\"intKeyedMap.1a.map.0b\": \"testing\"}}");
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
ListModelWrapper concreteTypeWithListAttributeOfInterfaceType;
}
@@ -1425,6 +1487,10 @@ class UpdateMapperUnitTests {
Map<Object, NestedDocument> concreteMap;
}
static class EntityWithIntKeyedMap {
Map<Integer, EntityWithObjectMap> intKeyedMap;
}
static class ClassWithEnum {
Allocation allocation;
@@ -1551,4 +1617,8 @@ class UpdateMapperUnitTests {
String transientValue;
}
static class EntityWithNestedMap {
Map<String, Map<String, Map<String, Object>>> levelOne;
}
}

View File

@@ -63,6 +63,7 @@ import com.mongodb.client.MongoCollection;
/**
* @author Christoph Strobl
* @author Mark Paluch
* @author Ivan Volzhev
*/
@ExtendWith({ MongoClientExtension.class, SpringExtension.class })
@ContextConfiguration
@@ -329,6 +330,21 @@ public class GeoJsonTests {
assertThat(result.geoJsonMultiPoint).isEqualTo(obj.geoJsonMultiPoint);
}
@Test // DATAMONGO-3776
public void shouldSaveAndRetrieveDocumentWithGeoJsonMultiPointTypeWithOnePointCorrectly() {
DocumentWithPropertyUsingGeoJsonType obj = new DocumentWithPropertyUsingGeoJsonType();
obj.id = "geoJsonMultiPoint";
obj.geoJsonMultiPoint = new GeoJsonMultiPoint(new Point(0, 0));
template.save(obj);
DocumentWithPropertyUsingGeoJsonType result = template.findOne(query(where("id").is(obj.id)),
DocumentWithPropertyUsingGeoJsonType.class);
assertThat(result.geoJsonMultiPoint).isEqualTo(obj.geoJsonMultiPoint);
}
@Test // DATAMONGO-1137
public void shouldSaveAndRetrieveDocumentWithGeoJsonMultiPolygonTypeCorrectly() {

View File

@@ -237,11 +237,8 @@ class QueryTests {
source.addCriteria(where("From one make ten").is("and two let be."));
Query target = Query.of(source);
compareQueries(target, source);
source.addCriteria(where("Make even three").is("then rich you'll be."));
assertThat(target.getQueryObject()).isEqualTo(new Document("From one make ten", "and two let be."))
.isNotEqualTo(source.getQueryObject());
assertThat(target.getQueryObject()).containsAllEntriesOf(new Document("From one make ten", "and two let be."))
.isNotSameAs(source.getQueryObject());
}
@Test // DATAMONGO-1783
@@ -353,9 +350,12 @@ class QueryTests {
private void compareQueries(Query actual, Query expected) {
assertThat(actual.getCollation()).isEqualTo(expected.getCollation());
assertThat(actual.getSortObject()).isEqualTo(expected.getSortObject());
assertThat(actual.getFieldsObject()).isEqualTo(expected.getFieldsObject());
assertThat(actual.getQueryObject()).isEqualTo(expected.getQueryObject());
assertThat(actual.getSortObject()).hasSameSizeAs(expected.getSortObject())
.containsAllEntriesOf(expected.getSortObject());
assertThat(actual.getFieldsObject()).hasSameSizeAs(expected.getFieldsObject())
.containsAllEntriesOf(expected.getFieldsObject());
assertThat(actual.getQueryObject()).hasSameSizeAs(expected.getQueryObject())
.containsAllEntriesOf(expected.getQueryObject());
assertThat(actual.getHint()).isEqualTo(expected.getHint());
assertThat(actual.getLimit()).isEqualTo(expected.getLimit());
assertThat(actual.getSkip()).isEqualTo(expected.getSkip());

View File

@@ -60,7 +60,9 @@ import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.repository.Person.Sex;
import org.springframework.data.mongodb.repository.SampleEvaluationContextExtension.SampleSecurityContextHolder;
import org.springframework.data.mongodb.test.util.EnableIfMongoServerVersion;
@@ -1422,4 +1424,14 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
Person target = repository.findWithAggregationInProjection(alicia.getId());
assertThat(target.getFirstname()).isEqualTo(alicia.getFirstname().toUpperCase());
}
@Test // GH-3633
void annotatedQueryWithNullEqualityCheckShouldWork() {
operations.updateFirst(Query.query(Criteria.where("id").is(dave.getId())), Update.update("age", null), Person.class);
Person byQueryWithNullEqualityCheck = repository.findByQueryWithNullEqualityCheck();
assertThat(byQueryWithNullEqualityCheck.getId()).isEqualTo(dave.getId());
}
}

View File

@@ -410,4 +410,7 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
List<Person> findByUnwrappedUserUsername(String username);
List<Person> findByUnwrappedUser(User user);
@Query("{ 'age' : null }")
Person findByQueryWithNullEqualityCheck();
}

View File

@@ -127,8 +127,8 @@ public class PartTreeMongoQueryUnitTests {
}
@Test // DATAMONGO-1345, DATAMONGO-1735
public void doesNotDeriveFieldSpecForNormalDomainType() {
assertThat(deriveQueryFromMethod("findPersonBy", new Object[0]).getFieldsObject()).isEqualTo(new Document());
void doesNotDeriveFieldSpecForNormalDomainType() {
assertThat(deriveQueryFromMethod("findPersonBy", new Object[0]).getFieldsObject()).isEmpty();
}
@Test // DATAMONGO-1345
@@ -173,7 +173,7 @@ public class PartTreeMongoQueryUnitTests {
org.springframework.data.mongodb.core.query.Query query = deriveQueryFromMethod("findAllBy");
assertThat(query.getFieldsObject()).isEqualTo(new Document());
assertThat(query.getFieldsObject()).isEmpty();
}
@Test // DATAMONGO-1865

View File

@@ -27,6 +27,8 @@ import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.dao.PermissionDeniedDataAccessException;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.MongoDatabaseFactory;
@@ -122,6 +124,20 @@ public class QuerydslMongoPredicateExecutorIntegrationTests {
.containsExactly(dave);
}
@Test // GH-3751
public void findPage() {
assertThat(repository
.findAll(person.lastname.startsWith(oliver.getLastname()).and(person.firstname.startsWith(dave.getFirstname())),
PageRequest.of(0, 10))
.getContent()).containsExactly(dave);
assertThat(repository
.findAll(person.lastname.startsWith(oliver.getLastname()).and(person.firstname.startsWith(dave.getFirstname())),
Pageable.unpaged())
.getContent()).containsExactly(dave);
}
@Test // DATAMONGO-362, DATAMONGO-1848
public void springDataMongodbQueryShouldAllowJoinOnDBref() {

View File

@@ -119,6 +119,9 @@ public class MongoTestTemplateConfiguration {
mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(mappingContextConfigurer.initialEntitySet());
mappingContext.setAutoIndexCreation(mappingContextConfigurer.autocreateIndex);
if (mongoConverterConfigurer.customConversions != null) {
mappingContext.setSimpleTypeHolder(mongoConverterConfigurer.customConversions.getSimpleTypeHolder());
}
mappingContext.afterPropertiesSet();
}

View File

@@ -19,7 +19,7 @@ import static org.assertj.core.api.Assertions.*;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Collections;
import org.bson.BsonDouble;
import org.bson.BsonInt32;
@@ -29,10 +29,16 @@ import org.bson.BsonString;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.junit.jupiter.api.Test;
import org.springframework.data.mongodb.util.BsonUtils;
import com.mongodb.BasicDBList;
/**
* Unit tests for {@link BsonUtils}.
*
* @author Christoph Strobl
* @author Mark Paluch
*/
class BsonUtilsTest {
@@ -111,4 +117,13 @@ class BsonUtilsTest {
assertThat((Collection)BsonUtils.asCollection(source)).containsExactly(source);
}
@Test // GH-3702
void supportsBsonShouldReportIfConversionSupported() {
assertThat(BsonUtils.supportsBson("foo")).isFalse();
assertThat(BsonUtils.supportsBson(new Document())).isTrue();
assertThat(BsonUtils.supportsBson(new BasicDBList())).isTrue();
assertThat(BsonUtils.supportsBson(Collections.emptyMap())).isTrue();
}
}

View File

@@ -383,6 +383,13 @@ class ParameterBindingJsonReaderUnitTests {
.parse("{ 'stores.location' : { $geoWithin: { $centerSphere: [ [ 1.948516, 48.799029 ] , 0.004 ] } } }"));
}
@Test // GH-3633
void parsesNullValue() {
Document target = parse("{ 'parent' : null }");
assertThat(target).isEqualTo(new Document("parent", null));
}
private static Document parse(String json, Object... args) {
ParameterBindingJsonReader reader = new ParameterBindingJsonReader(json, args);

View File

@@ -25,8 +25,11 @@ import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type
import java.util.regex.Pattern
/**
* Unit tests for [Criteria] extensions.
*
* @author Tjeu Kayim
* @author Mark Paluch
* @author Sangyong Choi
*/
class TypedCriteriaExtensionsTests {
@@ -317,6 +320,54 @@ class TypedCriteriaExtensionsTests {
assertThat(typed).isEqualTo(expected)
}
@Test
fun `maxDistance() should equal expected criteria with nearSphere`() {
val point = Point(0.0, 0.0)
val typed = Building::location nearSphere point maxDistance 3.0
val expected = Criteria("location")
.nearSphere(point)
.maxDistance(3.0)
assertThat(typed).isEqualTo(expected)
}
@Test
fun `minDistance() should equal expected criteria with nearSphere`() {
val point = Point(0.0, 0.0)
val typed = Building::location nearSphere point minDistance 3.0
val expected = Criteria("location")
.nearSphere(point)
.minDistance(3.0)
assertThat(typed).isEqualTo(expected)
}
@Test
fun `maxDistance() should equal expected criteria with near`() {
val point = Point(0.0, 0.0)
val typed = Building::location near point maxDistance 3.0
val expected = Criteria("location")
.near(point)
.maxDistance(3.0)
assertThat(typed).isEqualTo(expected)
}
@Test
fun `minDistance() should equal expected criteria with near`() {
val point = Point(0.0, 0.0)
val typed = Building::location near point minDistance 3.0
val expected = Criteria("location")
.near(point)
.minDistance(3.0)
assertThat(typed).isEqualTo(expected)
}
@Test
fun `elemMatch() should equal expected criteria`() {

View File

@@ -0,0 +1,653 @@
[[mongo.aggregation]]
== Aggregation Framework Support
Spring Data MongoDB provides support for the Aggregation Framework introduced to MongoDB in version 2.2.
For further information, see the full https://docs.mongodb.org/manual/aggregation/[reference documentation] of the aggregation framework and other data aggregation tools for MongoDB.
[[mongo.aggregation.basic-concepts]]
=== Basic Concepts
The Aggregation Framework support in Spring Data MongoDB is based on the following key abstractions: `Aggregation`, `AggregationDefinition`, and `AggregationResults`.
* `Aggregation`
+
An `Aggregation` represents a MongoDB `aggregate` operation and holds the description of the aggregation pipeline instructions. Aggregations are created by invoking the appropriate `newAggregation(…)` static factory method of the `Aggregation` class, which takes a list of `AggregationOperation` and an optional input class.
+
The actual aggregate operation is run by the `aggregate` method of the `MongoTemplate`, which takes the desired output class as a parameter.
+
* `TypedAggregation`
+
A `TypedAggregation`, just like an `Aggregation`, holds the instructions of the aggregation pipeline and a reference to the input type that is used for mapping domain properties to actual document fields.
+
At runtime, field references get checked against the given input type, considering potential `@Field` annotations.
[NOTE]
====
Changed in 3.2: referencing non-existent properties no longer raises an error. To restore the previous behaviour, use the `strictMapping` option of `AggregationOptions`.
====
* `AggregationDefinition`
+
An `AggregationDefinition` represents a MongoDB aggregation pipeline operation and describes the processing that should be performed in this aggregation step. Although you could manually create an `AggregationDefinition`, we recommend using the static factory methods provided by the `Aggregation` class to construct an `AggregationDefinition`.
+
* `AggregationResults`
+
`AggregationResults` is the container for the result of an aggregate operation. It provides access to the raw aggregation result (in the form of a `Document`), to the mapped objects, and to other information about the aggregation.
+
The following listing shows the canonical example for using the Spring Data MongoDB support for the MongoDB Aggregation Framework:
+
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
Aggregation agg = newAggregation(
pipelineOP1(),
pipelineOP2(),
pipelineOPn()
);
AggregationResults<OutputType> results = mongoTemplate.aggregate(agg, "INPUT_COLLECTION_NAME", OutputType.class);
List<OutputType> mappedResult = results.getMappedResults();
----
Note that, if you provide an input class as the first parameter to the `newAggregation` method, the `MongoTemplate` derives the name of the input collection from this class. Otherwise, if you do not specify an input class, you must provide the name of the input collection explicitly. If both an input class and an input collection are provided, the latter takes precedence.
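For illustration, the following sketch (assuming a hypothetical `Product` input class mapped to a `products` collection, reusing the `OutputType` result class from above) shows both variants:
[source,java]
----
// Collection name derived from the Product input class
TypedAggregation<Product> agg = newAggregation(Product.class, project("name"));
AggregationResults<OutputType> derived = mongoTemplate.aggregate(agg, OutputType.class);

// Explicit collection name; takes precedence over the name derived from Product
AggregationResults<OutputType> explicit = mongoTemplate.aggregate(agg, "products", OutputType.class);
----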
[[mongo.aggregation.supported-aggregation-operations]]
=== Supported Aggregation Operations
The MongoDB Aggregation Framework provides the following types of aggregation operations:
* Pipeline Aggregation Operators
* Group/Accumulator Aggregation Operators
* Boolean Aggregation Operators
* Comparison Aggregation Operators
* Arithmetic Aggregation Operators
* String Aggregation Operators
* Date Aggregation Operators
* Array Aggregation Operators
* Conditional Aggregation Operators
* Lookup Aggregation Operators
* Convert Aggregation Operators
* Object Aggregation Operators
* Script Aggregation Operators
At the time of this writing, we provide support for the following Aggregation Operations in Spring Data MongoDB:
.Aggregation Operations currently supported by Spring Data MongoDB
[cols="2*"]
|===
| Pipeline Aggregation Operators
| `bucket`, `bucketAuto`, `count`, `facet`, `geoNear`, `graphLookup`, `group`, `limit`, `lookup`, `match`, `project`, `replaceRoot`, `skip`, `sort`, `sortByCount`, `unwind`
| Set Aggregation Operators
| `setEquals`, `setIntersection`, `setUnion`, `setDifference`, `setIsSubset`, `anyElementTrue`, `allElementsTrue`
| Group/Accumulator Aggregation Operators
| `addToSet`, `first`, `last`, `max`, `min`, `avg`, `push`, `sum`, `count` (+++*+++), `stdDevPop`, `stdDevSamp`
| Arithmetic Aggregation Operators
| `abs`, `add` (+++*+++ via `plus`), `ceil`, `divide`, `exp`, `floor`, `ln`, `log`, `log10`, `mod`, `multiply`, `pow`, `round`, `sqrt`, `subtract` (+++*+++ via `minus`), `trunc`
| String Aggregation Operators
| `concat`, `substr`, `toLower`, `toUpper`, `strcasecmp`, `indexOfBytes`, `indexOfCP`, `split`, `strLenBytes`, `strLenCP`, `substrCP`, `trim`, `ltrim`, `rtrim`
| Comparison Aggregation Operators
| `eq` (+++*+++ via `is`), `gt`, `gte`, `lt`, `lte`, `ne`
| Array Aggregation Operators
| `arrayElementAt`, `arrayToObject`, `concatArrays`, `filter`, `in`, `indexOfArray`, `isArray`, `range`, `reverseArray`, `reduce`, `size`, `slice`, `zip`
| Literal Operators
| `literal`
| Date Aggregation Operators
| `dayOfYear`, `dayOfMonth`, `dayOfWeek`, `year`, `month`, `week`, `hour`, `minute`, `second`, `millisecond`, `dateToString`, `dateFromString`, `dateFromParts`, `dateToParts`, `isoDayOfWeek`, `isoWeek`, `isoWeekYear`
| Variable Operators
| `map`
| Conditional Aggregation Operators
| `cond`, `ifNull`, `switch`
| Type Aggregation Operators
| `type`
| Convert Aggregation Operators
| `convert`, `toBool`, `toDate`, `toDecimal`, `toDouble`, `toInt`, `toLong`, `toObjectId`, `toString`
| Object Aggregation Operators
| `objectToArray`, `mergeObjects`
| Script Aggregation Operators
| `function`, `accumulator`
|===
+++*+++ The operation is mapped or added by Spring Data MongoDB.
Note that the aggregation operations not listed here are currently not supported by Spring Data MongoDB. Comparison aggregation operators are expressed as `Criteria` expressions.
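For example, a comparison such as `$gt` inside a `$match` stage is expressed through a `Criteria` (a minimal sketch, assuming a numeric `price` field):
[source,java]
----
// generates { $match: { price: { $gt: 100 } } }
match(Criteria.where("price").gt(100))
----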
[[mongo.aggregation.projection]]
=== Projection Expressions
Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined through the `project` method of the `Aggregation` class, either by passing a list of `String` objects or an aggregation framework `Fields` object. The projection can be extended with additional fields through a fluent API by using the `and(String)` method and aliased by using the `as(String)` method.
Note that you can also define fields with aliases by using the `Fields.field` static factory method of the aggregation framework, which you can then use to construct a new `Fields` instance. References to projected fields in later aggregation stages are valid only for the field names of included fields or their aliases (including newly defined fields and their aliases). Fields not included in the projection cannot be referenced in later aggregation stages. The following listings show examples of projection expressions:
.Projection expression examples
====
[source,java]
----
// generates {$project: {name: 1, netPrice: 1}}
project("name", "netPrice")
// generates {$project: {thing2: $thing1}}
project().and("thing1").as("thing2")
// generates {$project: {a: 1, b: 1, thing2: $thing1}}
project("a","b").and("thing1").as("thing2")
----
====
.Multi-Stage Aggregation using Projection and Sorting
====
[source,java]
----
// generates {$project: {name: 1, netPrice: 1}}, {$sort: {name: 1}}
project("name", "netPrice"), sort(ASC, "name")
// generates {$project: {name: $firstname}}, {$sort: {name: 1}}
project().and("firstname").as("name"), sort(ASC, "name")
// does not work
project().and("firstname").as("name"), sort(ASC, "firstname")
----
====
More examples for project operations can be found in the `AggregationTests` class. Note that further details regarding the projection expressions can be found in the https://docs.mongodb.org/manual/reference/operator/aggregation/project/#pipe._S_project[corresponding section] of the MongoDB Aggregation Framework reference documentation.
[[mongo.aggregation.facet]]
=== Faceted Classification
As of Version 3.4, MongoDB supports faceted classification by using the Aggregation Framework. A faceted classification uses semantic categories (either general or subject-specific) that are combined to create the full classification entry. Documents flowing through the aggregation pipeline are classified into buckets. A multi-faceted classification enables various aggregations on the same set of input documents, without needing to retrieve the input documents multiple times.
==== Buckets
Bucket operations categorize incoming documents into groups, called buckets, based on a specified expression and bucket boundaries. Bucket operations require a grouping field or a grouping expression. You can define them by using the `bucket()` and `bucketAuto()` methods of the `Aggregation` class. `BucketOperation` and `BucketAutoOperation` can expose accumulations based on aggregation expressions for input documents. You can extend the bucket operation with additional parameters through a fluent API by using the `with…()` methods and the `andOutput(String)` method. You can alias the operation by using the `as(String)` method. Each bucket is represented as a document in the output.
`BucketOperation` takes a defined set of boundaries to group incoming documents into these categories. Boundaries are required to be sorted. The following listing shows some examples of bucket operations:
.Bucket operation examples
====
[source,java]
----
// generates {$bucket: {groupBy: $price, boundaries: [0, 100, 400]}}
bucket("price").withBoundaries(0, 100, 400);
// generates {$bucket: {groupBy: $price, default: "Other", boundaries: [0, 100]}}
bucket("price").withBoundaries(0, 100).withDefault("Other");
// generates {$bucket: {groupBy: $price, boundaries: [0, 100], output: { count: { $sum: 1}}}}
bucket("price").withBoundaries(0, 100).andOutputCount().as("count");
// generates {$bucket: {groupBy: $price, boundaries: [0, 100], output: { titles: { $push: "$title"}}}}
bucket("price").withBoundaries(0, 100).andOutput("title").push().as("titles");
----
====
`BucketAutoOperation` determines boundaries in an attempt to evenly distribute documents into a specified number of buckets. `BucketAutoOperation` optionally takes a granularity value that specifies the https://en.wikipedia.org/wiki/Preferred_number[preferred number] series to use to ensure that the calculated boundary edges end on preferred round numbers or on powers of 10. The following listing shows examples of bucket operations:
.Bucket operation examples
====
[source,java]
----
// generates {$bucketAuto: {groupBy: $price, buckets: 5}}
bucketAuto("price", 5)
// generates {$bucketAuto: {groupBy: $price, buckets: 5, granularity: "E24"}}
bucketAuto("price", 5).withGranularity(Granularities.E24).withDefault("Other");
// generates {$bucketAuto: {groupBy: $price, buckets: 5, output: { titles: { $push: "$title"}}}
bucketAuto("price", 5).andOutput("title").push().as("titles");
----
====
To create output fields in buckets, bucket operations can use `AggregationExpression` through `andOutput()` and <<mongo.aggregation.projection.expressions, SpEL expressions>> through `andOutputExpression()`.
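The following sketch (assuming hypothetical `netPrice` and `tax` fields on the input documents) shows an output field derived from a SpEL expression through `andOutputExpression()`:
[source,java]
----
// generates {$bucket: {groupBy: $price, boundaries: [0, 100], output: { total: { $add: ["$netPrice", "$tax"]}}}}
bucket("price").withBoundaries(0, 100).andOutputExpression("netPrice + tax").as("total");
----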
Note that further details regarding bucket expressions can be found in the https://docs.mongodb.org/manual/reference/operator/aggregation/bucket/[`$bucket` section] and
https://docs.mongodb.org/manual/reference/operator/aggregation/bucketAuto/[`$bucketAuto` section] of the MongoDB Aggregation Framework reference documentation.
==== Multi-faceted Aggregation
Multiple aggregation pipelines can be used to create multi-faceted aggregations that characterize data across multiple dimensions (or facets) within a single aggregation stage. Multi-faceted aggregations provide multiple filters and categorizations to guide data browsing and analysis. A common implementation of faceting is how many online retailers provide ways to narrow down search results by applying filters on product price, manufacturer, size, and other factors.
You can define a `FacetOperation` by using the `facet()` method of the `Aggregation` class. You can customize it with multiple aggregation pipelines by using the `and()` method. Each sub-pipeline has its own field in the output document where its results are stored as an array of documents.
Sub-pipelines can project and filter input documents prior to grouping. Common use cases include extraction of date parts or calculations before categorization. The following listing shows facet operation examples:
.Facet operation examples
====
[source,java]
----
// generates {$facet: {categorizedByPrice: [ { $match: { price: {$exists : true}}}, { $bucketAuto: {groupBy: $price, buckets: 5}}]}}
facet(match(Criteria.where("price").exists(true)), bucketAuto("price", 5)).as("categorizedByPrice"))
// generates {$facet: {categorizedByCountry: [ { $match: { country: {$exists : true}}}, { $sortByCount: "$country"}]}}
facet(match(Criteria.where("country").exists(true)), sortByCount("country")).as("categorizedByCountry"))
// generates {$facet: {categorizedByYear: [
// { $project: { title: 1, publicationYear: { $year: "$publicationDate"}}},
// { $bucketAuto: {groupBy: $publicationYear, buckets: 5, output: { titles: {$push:"$title"}}}}
// ]}}
facet(project("title").and("publicationDate").extractYear().as("publicationYear"),
bucketAuto("publicationYear", 5).andOutput("title").push().as("titles"))
.as("categorizedByYear"))
----
====
Note that further details regarding facet operation can be found in the https://docs.mongodb.org/manual/reference/operator/aggregation/facet/[`$facet` section] of the MongoDB Aggregation Framework reference documentation.
[[mongo.aggregation.sort-by-count]]
==== Sort By Count
Sort by count operations group incoming documents based on the value of a specified expression, compute the count of documents in each distinct group, and sort the results by count. It offers a handy shortcut to apply sorting when using <<mongo.aggregation.facet>>. Sort by count operations require a grouping field or grouping expression. The following listing shows a sort by count example:
.Sort by count example
====
[source,java]
----
// generates { $sortByCount: "$country" }
sortByCount("country");
----
====
A sort by count operation is equivalent to the following BSON (Binary JSON):
----
{ $group: { _id: <expression>, count: { $sum: 1 } } },
{ $sort: { count: -1 } }
----
[[mongo.aggregation.projection.expressions]]
==== Spring Expression Support in Projection Expressions
We support the use of SpEL expressions in projection expressions through the `andExpression` method of the `ProjectionOperation` and `BucketOperation` classes. This feature lets you define the desired expression as a SpEL expression. On running a query, the SpEL expression is translated into a corresponding MongoDB projection expression part. This arrangement makes it much easier to express complex calculations.
===== Complex Calculations with SpEL expressions
Consider the following SpEL expression:
[source,java]
----
1 + (q + 1) / (q - 1)
----
The preceding expression is translated into the following projection expression part:
[source,javascript]
----
{ "$add" : [ 1, {
"$divide" : [ {
"$add":["$q", 1]}, {
"$subtract":[ "$q", 1]}
]
}]}
----
You can see examples in more context in <<mongo.aggregation.examples.example5>> and <<mongo.aggregation.examples.example6>>. You can find more usage examples for supported SpEL expression constructs in `SpelExpressionTransformerUnitTests`. The following table shows the SpEL transformations supported by Spring Data MongoDB:
.Supported SpEL transformations
[%header,cols="2"]
|===
| SpEL Expression
| Mongo Expression Part
| a == b
| { $eq : [$a, $b] }
| a != b
| { $ne : [$a , $b] }
| a > b
| { $gt : [$a, $b] }
| a >= b
| { $gte : [$a, $b] }
| a < b
| { $lt : [$a, $b] }
| a <= b
| { $lte : [$a, $b] }
| a + b
| { $add : [$a, $b] }
| a - b
| { $subtract : [$a, $b] }
| a * b
| { $multiply : [$a, $b] }
| a / b
| { $divide : [$a, $b] }
| a^b
| { $pow : [$a, $b] }
| a % b
| { $mod : [$a, $b] }
| a && b
| { $and : [$a, $b] }
| a \|\| b
| { $or : [$a, $b] }
| !a
| { $not : [$a] }
|===
In addition to the transformations shown in the preceding table, you can use standard SpEL operations such as `new` to (for example) create arrays and reference expressions through their names (followed by the arguments to use in brackets). The following example shows how to create an array in this fashion:
[source,java]
----
// { $setEquals : [$a, [5, 8, 13] ] }
.andExpression("setEquals(a, new int[]{5, 8, 13})");
----
[[mongo.aggregation.examples]]
==== Aggregation Framework Examples
The examples in this section demonstrate the usage patterns for the MongoDB Aggregation Framework with Spring Data MongoDB.
[[mongo.aggregation.examples.example1]]
===== Aggregation Framework Example 1
In this introductory example, we want to aggregate a list of tags to get the occurrence count of a particular tag from a MongoDB collection (called `tags`) sorted by the occurrence count in descending order. This example demonstrates the usage of grouping, sorting, projections (selection), and unwinding (result splitting).
[source,java]
----
class TagCount {
String tag;
int n;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
Aggregation agg = newAggregation(
project("tags"),
unwind("tags"),
group("tags").count().as("n"),
project("n").and("tag").previousOperation(),
sort(DESC, "n")
);
AggregationResults<TagCount> results = mongoTemplate.aggregate(agg, "tags", TagCount.class);
List<TagCount> tagCount = results.getMappedResults();
----
The preceding listing uses the following algorithm:
. Create a new aggregation by using the `newAggregation` static factory method, to which we pass a list of aggregation operations. These aggregate operations define the aggregation pipeline of our `Aggregation`.
. Use the `project` operation to select the `tags` field (which is an array of strings) from the input collection.
. Use the `unwind` operation to generate a new document for each tag within the `tags` array.
. Use the `group` operation to define a group for each `tags` value for which we aggregate the occurrence count (by using the `count` aggregation operator and collecting the result in a new field called `n`).
. Select the `n` field and create an alias for the ID field generated from the previous group operation (hence the call to `previousOperation()`) with a name of `tag`.
. Use the `sort` operation to sort the resulting list of tags by their occurrence count in descending order.
. Call the `aggregate` method on `MongoTemplate` to let MongoDB perform the actual aggregation operation, with the created `Aggregation` as an argument.
Note that the input collection is explicitly specified as the `tags` parameter to the `aggregate` method. If the name of the input collection is not specified explicitly, it is derived from the input class passed as the first parameter to the `newAggregation` method.
[[mongo.aggregation.examples.example2]]
===== Aggregation Framework Example 2
This example is based on the https://docs.mongodb.org/manual/tutorial/aggregation-examples/#largest-and-smallest-cities-by-state[Largest and Smallest Cities by State] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return the smallest and largest cities by population for each state by using the aggregation framework. This example demonstrates grouping, sorting, and projections (selection).
[source,java]
----
class ZipInfo {
String id;
String city;
String state;
@Field("pop") int population;
@Field("loc") double[] location;
}
class City {
String name;
int population;
}
class ZipInfoStats {
String id;
String state;
City biggestCity;
City smallestCity;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
TypedAggregation<ZipInfo> aggregation = newAggregation(ZipInfo.class,
group("state", "city")
.sum("population").as("pop"),
sort(ASC, "pop", "state", "city"),
group("state")
.last("city").as("biggestCity")
.last("pop").as("biggestPop")
.first("city").as("smallestCity")
.first("pop").as("smallestPop"),
project()
.and("state").previousOperation()
.and("biggestCity")
.nested(bind("name", "biggestCity").and("population", "biggestPop"))
.and("smallestCity")
.nested(bind("name", "smallestCity").and("population", "smallestPop")),
sort(ASC, "state")
);
AggregationResults<ZipInfoStats> result = mongoTemplate.aggregate(aggregation, ZipInfoStats.class);
ZipInfoStats firstZipInfoStats = result.getMappedResults().get(0);
----
Note that the `ZipInfo` class maps the structure of the given input collection. The `ZipInfoStats` class defines the structure in the desired output format.
The preceding listings use the following algorithm:
. Use the `group` operation to define a group from the input collection. The grouping criteria is the combination of the `state` and `city` fields, which forms the ID structure of the group. We aggregate the value of the `population` property from the grouped elements by using the `sum` operator and save the result in the `pop` field.
. Use the `sort` operation to sort the intermediate result by the `pop`, `state`, and `city` fields, in ascending order, such that the smallest city is at the top and the biggest city is at the bottom of the result. Note that the sorting on `state` and `city` is implicitly performed against the group ID fields (which Spring Data MongoDB handles).
. Use a `group` operation again to group the intermediate result by `state`. Note that `state` again implicitly references a group ID field. We select the name and the population count of the biggest and smallest city with calls to the `last(…)` and `first(…)` operators, respectively, within this `group` operation.
. Select the `state` field from the previous `group` operation. Note that `state` again implicitly references a group ID field. Because we do not want an implicitly generated ID to appear, we exclude the ID from the previous operation by using `and(previousOperation()).exclude()`. Because we want to populate the nested `City` structures in our output class, we have to emit appropriate sub-documents by using the nested method.
. Sort the resulting list of `ZipInfoStats` by their state name in ascending order in the `sort` operation.
Note that we derive the name of the input collection from the `ZipInfo` class passed as the first parameter to the `newAggregation` method.
[[mongo.aggregation.examples.example3]]
===== Aggregation Framework Example 3
This example is based on the https://docs.mongodb.org/manual/tutorial/aggregation-examples/#states-with-populations-over-10-million[States with Populations Over 10 Million] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return all states with a population greater than 10 million, using the aggregation framework. This example demonstrates grouping, sorting, and matching (filtering).
[source,java]
----
class StateStats {
@Id String id;
String state;
@Field("totalPop") int totalPopulation;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
TypedAggregation<ZipInfo> agg = newAggregation(ZipInfo.class,
group("state").sum("population").as("totalPop"),
sort(ASC, previousOperation(), "totalPop"),
match(where("totalPop").gte(10 * 1000 * 1000))
);
AggregationResults<StateStats> result = mongoTemplate.aggregate(agg, StateStats.class);
List<StateStats> stateStatsList = result.getMappedResults();
----
The preceding listings use the following algorithm:
. Group the input collection by the `state` field and calculate the sum of the `population` field and store the result in the new field `"totalPop"`.
. Sort the intermediate result by the id-reference of the previous group operation in addition to the `"totalPop"` field in ascending order.
. Filter the intermediate result by using a `match` operation which accepts a `Criteria` query as an argument.
Note that we derive the name of the input collection from the `ZipInfo` class passed as the first parameter to the `newAggregation` method.
[[mongo.aggregation.examples.example4]]
===== Aggregation Framework Example 4
This example demonstrates the use of simple arithmetic operations in the projection operation.
[source,java]
----
class Product {
String id;
String name;
double netPrice;
int spaceUnits;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
TypedAggregation<Product> agg = newAggregation(Product.class,
project("name", "netPrice")
.and("netPrice").plus(1).as("netPricePlus1")
.and("netPrice").minus(1).as("netPriceMinus1")
.and("netPrice").multiply(1.19).as("grossPrice")
.and("netPrice").divide(2).as("netPriceDiv2")
.and("spaceUnits").mod(2).as("spaceUnitsMod2")
);
AggregationResults<Document> result = mongoTemplate.aggregate(agg, Document.class);
List<Document> resultList = result.getMappedResults();
----
Note that we derive the name of the input collection from the `Product` class passed as the first parameter to the `newAggregation` method.
[[mongo.aggregation.examples.example5]]
===== Aggregation Framework Example 5
This example demonstrates the use of simple arithmetic operations derived from SpEL Expressions in the projection operation.
[source,java]
----
class Product {
String id;
String name;
double netPrice;
int spaceUnits;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
TypedAggregation<Product> agg = newAggregation(Product.class,
project("name", "netPrice")
.andExpression("netPrice + 1").as("netPricePlus1")
.andExpression("netPrice - 1").as("netPriceMinus1")
.andExpression("netPrice / 2").as("netPriceDiv2")
.andExpression("netPrice * 1.19").as("grossPrice")
.andExpression("spaceUnits % 2").as("spaceUnitsMod2")
.andExpression("(netPrice * 0.8 + 1.2) * 1.19").as("grossPriceIncludingDiscountAndCharge")
);
AggregationResults<Document> result = mongoTemplate.aggregate(agg, Document.class);
List<Document> resultList = result.getMappedResults();
----
[[mongo.aggregation.examples.example6]]
===== Aggregation Framework Example 6
This example demonstrates the use of complex arithmetic operations derived from SpEL Expressions in the projection operation.
Note: The additional parameters passed to the `andExpression` method can be referenced with indexer expressions according to their position. In this example, we reference the first parameter of the parameters array with `[0]`. When the SpEL expression is transformed into a MongoDB aggregation framework expression, external parameter expressions are replaced with their respective values.
[source,java]
----
class Product {
String id;
String name;
double netPrice;
int spaceUnits;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
double shippingCosts = 1.2;
TypedAggregation<Product> agg = newAggregation(Product.class,
project("name", "netPrice")
.andExpression("(netPrice * (1-discountRate) + [0]) * (1+taxRate)", shippingCosts).as("salesPrice")
);
AggregationResults<Document> result = mongoTemplate.aggregate(agg, Document.class);
List<Document> resultList = result.getMappedResults();
----
Note that we can also refer to other fields of the document within the SpEL expression.
[[mongo.aggregation.examples.example7]]
===== Aggregation Framework Example 7
This example uses conditional projection. It is derived from the https://docs.mongodb.com/manual/reference/operator/aggregation/cond/[$cond reference documentation].
[source,java]
----
public class InventoryItem {
@Id int id;
String item;
String description;
int qty;
}
public class InventoryItemProjection {
@Id int id;
String item;
String description;
int qty;
int discount;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
TypedAggregation<InventoryItem> agg = newAggregation(InventoryItem.class,
project("item").and("discount")
.applyCondition(ConditionalOperator.newBuilder().when(Criteria.where("qty").gte(250))
.then(30)
.otherwise(20))
.and(ifNull("description", "Unspecified")).as("description")
);
AggregationResults<InventoryItemProjection> result = mongoTemplate.aggregate(agg, "inventory", InventoryItemProjection.class);
List<InventoryItemProjection> stateStatsList = result.getMappedResults();
----
This one-step aggregation uses a projection operation with the `inventory` collection. We project the `discount` field by using a conditional operation for all inventory items that have a `qty` greater than or equal to `250`. A second conditional projection is performed for the `description` field. We apply the `Unspecified` description to all items that either do not have a `description` field or items that have a `null` description.
As of MongoDB 3.6, it is possible to exclude fields from the projection by using a conditional expression.
.Conditional aggregation projection
====
[source,java]
----
TypedAggregation<Book> agg = Aggregation.newAggregation(Book.class,
project("title")
.and(ConditionalOperators.when(ComparisonOperators.valueOf("author.middle") <1>
.equalToValue("")) <2>
.then("$$REMOVE") <3>
.otherwiseValueOf("author.middle") <4>
)
.as("author.middle"));
----
<1> If the value of the field `author.middle`
<2> does not contain a value,
<3> then use https://docs.mongodb.com/manual/reference/aggregation-variables/#variable.REMOVE[``$$REMOVE``] to exclude the field.
<4> Otherwise, add the field value of `author.middle`.
====


@@ -0,0 +1,115 @@
[[gridfs]]
== GridFS Support
MongoDB supports storing binary files inside its filesystem, GridFS. Spring Data MongoDB provides a `GridFsOperations` interface as well as the corresponding implementation, `GridFsTemplate`, to let you interact with the filesystem. You can set up a `GridFsTemplate` instance by handing it a `MongoDatabaseFactory` as well as a `MongoConverter`, as the following example shows:
.JavaConfig setup for a GridFsTemplate
====
[source,java]
----
class GridFsConfiguration extends AbstractMongoClientConfiguration {
// … further configuration omitted
@Bean
public GridFsTemplate gridFsTemplate() {
return new GridFsTemplate(mongoDbFactory(), mappingMongoConverter());
}
}
----
====
The corresponding XML configuration follows:
.XML configuration for a GridFsTemplate
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo
https://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans
https://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="mongoDbFactory" dbname="database" />
<mongo:mapping-converter id="converter" />
<bean class="org.springframework.data.mongodb.gridfs.GridFsTemplate">
<constructor-arg ref="mongoDbFactory" />
<constructor-arg ref="converter" />
</bean>
</beans>
----
====
The template can now be injected and used to perform storage and retrieval operations, as the following example shows:
.Using GridFsTemplate to store files
====
[source,java]
----
class GridFsClient {
@Autowired
GridFsOperations operations;
@Test
public void storeFileToGridFs() {
FileMetadata metadata = new FileMetadata();
// populate metadata
Resource file = … // lookup File or Resource
operations.store(file.getInputStream(), "filename.txt", metadata);
}
}
----
====
The `store(…)` operations take an `InputStream`, a filename, and (optionally) metadata information about the file to store. The metadata can be an arbitrary object, which will be marshaled by the `MongoConverter` configured with the `GridFsTemplate`. Alternatively, you can also provide a `Document`.
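For example (a minimal sketch, assuming an `InputStream` named `stream`), a content type and a plain `org.bson.Document` can be passed instead of a mapped metadata object:
[source,java]
----
// store(…) variant taking a content type and a Document as metadata
operations.store(stream, "filename.txt", "text/plain", new Document("source", "upload"));
----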
You can read files from the filesystem through either the `find(…)` or the `getResources(…)` methods. Let's have a look at the `find(…)` methods first. You can either find a single file or multiple files that match a `Query`. You can use the `GridFsCriteria` helper class to define queries. It provides static factory methods to encapsulate default metadata fields (such as `whereFilename()` and `whereContentType()`) or a custom one through `whereMetaData()`. The following example shows how to use `GridFsTemplate` to query for files:
.Using GridFsTemplate to query for files
====
[source,java]
----
class GridFsClient {
@Autowired
GridFsOperations operations;
@Test
public void findFilesInGridFs() {
GridFSFindIterable result = operations.find(query(whereFilename().is("filename.txt")));
}
}
----
====
NOTE: Currently, MongoDB does not support defining sort criteria when retrieving files from GridFS. For this reason, any sort criteria defined on the `Query` instance handed into the `find(…)` method are disregarded.
The other option to read files from GridFS is to use the methods introduced by the `ResourcePatternResolver` interface. They allow handing an Ant path into the method and can thus retrieve files matching the given pattern. The following example shows how to use `GridFsTemplate` to read files:
.Using GridFsTemplate to read files
====
[source,java]
----
class GridFsClient {
@Autowired
GridFsOperations operations;
@Test
public void readFilesFromGridFs() {
GridFsResource[] txtFiles = operations.getResources("*.txt");
}
}
----
====
`GridFsOperations` extends `ResourcePatternResolver` and lets the `GridFsTemplate` (for example) be plugged into an `ApplicationContext` to read Spring configuration files from a MongoDB database.
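A single file can likewise be obtained by name as a `GridFsResource` (a minimal sketch, assuming the file exists):
[source,java]
----
GridFsResource resource = operations.getResource("filename.txt");
try (InputStream in = resource.getInputStream()) {
	// consume the file content
}
----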


@@ -214,7 +214,7 @@ public class AppConfig {
----
====
To access the `com.mongodb.client.MongoClient` object created by the `MongoClientFactoryBean` in other `@Configuration` classes or your own classes, use a `private @Autowired Mongo mongo;` field.
To access the `com.mongodb.client.MongoClient` object created by the `MongoClientFactoryBean` in other `@Configuration` classes or your own classes, use a `private @Autowired MongoClient mongoClient;` field.
[[mongo.mongo-xml-config]]
=== Registering a Mongo Instance by Using XML-based Metadata
@@ -2419,660 +2419,7 @@ GroupByResults<XObject> results = mongoTemplate.group(where("x").gt(0),
keyFunction("classpath:keyFunction.js").initialDocument("{ count: 0 }").reduceFunction("classpath:groupReduce.js"), XObject.class);
----
[[mongo.aggregation]]
== Aggregation Framework Support
Spring Data MongoDB provides support for the Aggregation Framework introduced to MongoDB in version 2.2.
For further information, see the full https://docs.mongodb.org/manual/aggregation/[reference documentation] of the aggregation framework and other data aggregation tools for MongoDB.
[[mongo.aggregation.basic-concepts]]
=== Basic Concepts
The Aggregation Framework support in Spring Data MongoDB is based on the following key abstractions: `Aggregation`, `AggregationDefinition`, and `AggregationResults`.
* `Aggregation`
+
An `Aggregation` represents a MongoDB `aggregate` operation and holds the description of the aggregation pipeline instructions. Aggregations are created by invoking the appropriate `newAggregation(…)` static factory method of the `Aggregation` class, which takes a list of `AggregateOperation` and an optional input class.
+
The actual aggregate operation is run by the `aggregate` method of the `MongoTemplate`, which takes the desired output class as a parameter.
+
* `TypedAggregation`
+
A `TypedAggregation`, just like an `Aggregation`, holds the instructions of the aggregation pipeline and a reference to the input type, that is used for mapping domain properties to actual document fields.
+
At runtime, field references get checked against the given input type, considering potential `@Field` annotations.
[NOTE]
====
Changed in 3.2 referencing none-xistent properties does no longer raise errors. To restore the previous behaviour use the `strictMapping` option of `AggregationOptions`.
====
+
* `AggregationDefinition`
+
An `AggregationDefinition` represents a MongoDB aggregation pipeline operation and describes the processing that should be performed in this aggregation step. Although you could manually create an `AggregationDefinition`, we recommend using the static factory methods provided by the `Aggregate` class to construct an `AggregateOperation`.
+
* `AggregationResults`
+
`AggregationResults` is the container for the result of an aggregate operation. It provides access to the raw aggregation result, in the form of a `Document` to the mapped objects and other information about the aggregation.
+
The following listing shows the canonical example for using the Spring Data MongoDB support for the MongoDB Aggregation Framework:
+
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
Aggregation agg = newAggregation(
pipelineOP1(),
pipelineOP2(),
pipelineOPn()
);
AggregationResults<OutputType> results = mongoTemplate.aggregate(agg, "INPUT_COLLECTION_NAME", OutputType.class);
List<OutputType> mappedResult = results.getMappedResults();
----
Note that, if you provide an input class as the first parameter to the `newAggregation` method, the `MongoTemplate` derives the name of the input collection from this class. Otherwise, if you do not not specify an input class, you must provide the name of the input collection explicitly. If both an input class and an input collection are provided, the latter takes precedence.
[[mongo.aggregation.supported-aggregation-operations]]
=== Supported Aggregation Operations
The MongoDB Aggregation Framework provides the following types of aggregation operations:
* Pipeline Aggregation Operators
* Group Aggregation Operators
* Boolean Aggregation Operators
* Comparison Aggregation Operators
* Arithmetic Aggregation Operators
* String Aggregation Operators
* Date Aggregation Operators
* Array Aggregation Operators
* Conditional Aggregation Operators
* Lookup Aggregation Operators
* Convert Aggregation Operators
* Object Aggregation Operators
* Script Aggregation Operators
At the time of this writing, we provide support for the following Aggregation Operations in Spring Data MongoDB:
.Aggregation Operations currently supported by Spring Data MongoDB
[cols="2*"]
|===
| Pipeline Aggregation Operators
| `bucket`, `bucketAuto`, `count`, `facet`, `geoNear`, `graphLookup`, `group`, `limit`, `lookup`, `match`, `project`, `replaceRoot`, `skip`, `sort`, `unwind`
| Set Aggregation Operators
| `setEquals`, `setIntersection`, `setUnion`, `setDifference`, `setIsSubset`, `anyElementTrue`, `allElementsTrue`
| Group Aggregation Operators
| `addToSet`, `first`, `last`, `max`, `min`, `avg`, `push`, `sum`, `(*count)`, `stdDevPop`, `stdDevSamp`
| Arithmetic Aggregation Operators
| `abs`, `add` (*via `plus`), `ceil`, `divide`, `exp`, `floor`, `ln`, `log`, `log10`, `mod`, `multiply`, `pow`, `round`, `sqrt`, `subtract` (*via `minus`), `trunc`
| String Aggregation Operators
| `concat`, `substr`, `toLower`, `toUpper`, `stcasecmp`, `indexOfBytes`, `indexOfCP`, `split`, `strLenBytes`, `strLenCP`, `substrCP`, `trim`, `ltrim`, `rtim`
| Comparison Aggregation Operators
| `eq` (*via: `is`), `gt`, `gte`, `lt`, `lte`, `ne`
| Array Aggregation Operators
| `arrayElementAt`, `arrayToObject`, `concatArrays`, `filter`, `in`, `indexOfArray`, `isArray`, `range`, `reverseArray`, `reduce`, `size`, `slice`, `zip`
| Literal Operators
| `literal`
| Date Aggregation Operators
| `dayOfYear`, `dayOfMonth`, `dayOfWeek`, `year`, `month`, `week`, `hour`, `minute`, `second`, `millisecond`, `dateToString`, `dateFromString`, `dateFromParts`, `dateToParts`, `isoDayOfWeek`, `isoWeek`, `isoWeekYear`
| Variable Operators
| `map`
| Conditional Aggregation Operators
| `cond`, `ifNull`, `switch`
| Type Aggregation Operators
| `type`
| Convert Aggregation Operators
| `convert`, `toBool`, `toDate`, `toDecimal`, `toDouble`, `toInt`, `toLong`, `toObjectId`, `toString`
| Object Aggregation Operators
| `objectToArray`, `mergeObjects`
| Script Aggregation Operators
| `function`, `accumulator`
|===
* The operation is mapped or added by Spring Data MongoDB.
Note that the aggregation operations not listed here are currently not supported by Spring Data MongoDB. Comparison aggregation operators are expressed as `Criteria` expressions.
[[mongo.aggregation.projection]]
=== Projection Expressions
Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined through the `project` method of the `Aggregation` class, either by passing a list of `String` objects or an aggregation framework `Fields` object. The projection can be extended with additional fields through a fluent API by using the `and(String)` method and aliased by using the `as(String)` method.
Note that you can also define fields with aliases by using the `Fields.field` static factory method of the aggregation framework, which you can then use to construct a new `Fields` instance. References to projected fields in later aggregation stages are valid only for the field names of included fields or their aliases (including newly defined fields and their aliases). Fields not included in the projection cannot be referenced in later aggregation stages. The following listings show examples of projection expression:
.Projection expression examples
====
[source,java]
----
// generates {$project: {name: 1, netPrice: 1}}
project("name", "netPrice")
// generates {$project: {thing1: $thing2}}
project().and("thing1").as("thing2")
// generates {$project: {a: 1, b: 1, thing2: $thing1}}
project("a","b").and("thing1").as("thing2")
----
====
.Multi-Stage Aggregation using Projection and Sorting
====
[source,java]
----
// generates {$project: {name: 1, netPrice: 1}}, {$sort: {name: 1}}
project("name", "netPrice"), sort(ASC, "name")
// generates {$project: {name: $firstname}}, {$sort: {name: 1}}
project().and("firstname").as("name"), sort(ASC, "name")
// does not work
project().and("firstname").as("name"), sort(ASC, "firstname")
----
====
More examples for project operations can be found in the `AggregationTests` class. Note that further details regarding the projection expressions can be found in the https://docs.mongodb.org/manual/reference/operator/aggregation/project/#pipe._S_project[corresponding section] of the MongoDB Aggregation Framework reference documentation.
[[mongo.aggregation.facet]]
=== Faceted Classification
As of Version 3.4, MongoDB supports faceted classification by using the Aggregation Framework. A faceted classification uses semantic categories (either general or subject-specific) that are combined to create the full classification entry. Documents flowing through the aggregation pipeline are classified into buckets. A multi-faceted classification enables various aggregations on the same set of input documents, without needing to retrieve the input documents multiple times.
==== Buckets
Bucket operations categorize incoming documents into groups, called buckets, based on a specified expression and bucket boundaries. Bucket operations require a grouping field or a grouping expression. You can define them by using the `bucket()` and `bucketAuto()` methods of the `Aggregate` class. `BucketOperation` and `BucketAutoOperation` can expose accumulations based on aggregation expressions for input documents. You can extend the bucket operation with additional parameters through a fluent API by using the `with…()` methods and the `andOutput(String)` method. You can alias the operation by using the `as(String)` method. Each bucket is represented as a document in the output.
`BucketOperation` takes a defined set of boundaries to group incoming documents into these categories. Boundaries are required to be sorted. The following listing shows some examples of bucket operations:
.Bucket operation examples
====
[source,java]
----
// generates {$bucket: {groupBy: $price, boundaries: [0, 100, 400]}}
bucket("price").withBoundaries(0, 100, 400);
// generates {$bucket: {groupBy: $price, default: "Other" boundaries: [0, 100]}}
bucket("price").withBoundaries(0, 100).withDefault("Other");
// generates {$bucket: {groupBy: $price, boundaries: [0, 100], output: { count: { $sum: 1}}}}
bucket("price").withBoundaries(0, 100).andOutputCount().as("count");
// generates {$bucket: {groupBy: $price, boundaries: [0, 100], 5, output: { titles: { $push: "$title"}}}
bucket("price").withBoundaries(0, 100).andOutput("title").push().as("titles");
----
====
`BucketAutoOperation` determines boundaries in an attempt to evenly distribute documents into a specified number of buckets. `BucketAutoOperation` optionally takes a granularity value that specifies the https://en.wikipedia.org/wiki/Preferred_number[preferred number] series to use to ensure that the calculated boundary edges end on preferred round numbers or on powers of 10. The following listing shows examples of bucket operations:
.Bucket operation examples
====
[source,java]
----
// generates {$bucketAuto: {groupBy: $price, buckets: 5}}
bucketAuto("price", 5)
// generates {$bucketAuto: {groupBy: $price, buckets: 5, granularity: "E24"}}
bucketAuto("price", 5).withGranularity(Granularities.E24).withDefault("Other");
// generates {$bucketAuto: {groupBy: $price, buckets: 5, output: { titles: { $push: "$title"}}}
bucketAuto("price", 5).andOutput("title").push().as("titles");
----
====
To create output fields in buckets, bucket operations can use `AggregationExpression` through `andOutput()` and <<mongo.aggregation.projection.expressions, SpEL expressions>> through `andOutputExpression()`.
Note that further details regarding bucket expressions can be found in the https://docs.mongodb.org/manual/reference/operator/aggregation/bucket/[`$bucket` section] and
https://docs.mongodb.org/manual/reference/operator/aggregation/bucketAuto/[`$bucketAuto` section] of the MongoDB Aggregation Framework reference documentation.
==== Multi-faceted Aggregation
Multiple aggregation pipelines can be used to create multi-faceted aggregations that characterize data across multiple dimensions (or facets) within a single aggregation stage. Multi-faceted aggregations provide multiple filters and categorizations to guide data browsing and analysis. A common implementation of faceting is how many online retailers provide ways to narrow down search results by applying filters on product price, manufacturer, size, and other factors.
You can define a `FacetOperation` by using the `facet()` method of the `Aggregation` class. You can customize it with multiple aggregation pipelines by using the `and()` method. Each sub-pipeline has its own field in the output document where its results are stored as an array of documents.
Sub-pipelines can project and filter input documents prior to grouping. Common use cases include extraction of date parts or calculations before categorization. The following listing shows facet operation examples:
.Facet operation examples
====
[source,java]
----
// generates {$facet: {categorizedByPrice: [ { $match: { price: {$exists : true}}}, { $bucketAuto: {groupBy: $price, buckets: 5}}]}}
facet(match(Criteria.where("price").exists(true)), bucketAuto("price", 5)).as("categorizedByPrice"))
// generates {$facet: {categorizedByCountry: [ { $match: { country: {$exists : true}}}, { $sortByCount: "$country"}]}}
facet(match(Criteria.where("country").exists(true)), sortByCount("country")).as("categorizedByCountry"))
// generates {$facet: {categorizedByYear: [
// { $project: { title: 1, publicationYear: { $year: "publicationDate"}}},
// { $bucketAuto: {groupBy: $price, buckets: 5, output: { titles: {$push:"$title"}}}
// ]}}
facet(project("title").and("publicationDate").extractYear().as("publicationYear"),
bucketAuto("publicationYear", 5).andOutput("title").push().as("titles"))
.as("categorizedByYear"))
----
====
Note that further details regarding facet operation can be found in the https://docs.mongodb.org/manual/reference/operator/aggregation/facet/[`$facet` section] of the MongoDB Aggregation Framework reference documentation.
[[mongo.aggregation.sort-by-count]]
==== Sort By Count
Sort by count operations group incoming documents based on the value of a specified expression, compute the count of documents in each distinct group, and sort the results by count. It offers a handy shortcut to apply sorting when using <<mongo.aggregation.facet>>. Sort by count operations require a grouping field or grouping expression. The following listing shows a sort by count example:
.Sort by count example
====
[source,java]
----
// generates { $sortByCount: "$country" }
sortByCount("country");
----
====
A sort by count operation is equivalent to the following BSON (Binary JSON):
----
{ $group: { _id: <expression>, count: { $sum: 1 } } },
{ $sort: { count: -1 } }
----
[[mongo.aggregation.projection.expressions]]
==== Spring Expression Support in Projection Expressions
We support the use of SpEL expressions in projection expressions through the `andExpression` method of the `ProjectionOperation` and `BucketOperation` classes. This feature lets you define the desired expression as a SpEL expression. On running a query, the SpEL expression is translated into a corresponding MongoDB projection expression part. This arrangement makes it much easier to express complex calculations.
===== Complex Calculations with SpEL expressions
Consider the following SpEL expression:
[source,java]
----
1 + (q + 1) / (q - 1)
----
The preceding expression is translated into the following projection expression part:
[source,javascript]
----
{ "$add" : [ 1, {
"$divide" : [ {
"$add":["$q", 1]}, {
"$subtract":[ "$q", 1]}
]
}]}
----
You can see examples in more context in <<mongo.aggregation.examples.example5>> and <<mongo.aggregation.examples.example6>>. You can find more usage examples for supported SpEL expression constructs in `SpelExpressionTransformerUnitTests`. The following table shows the SpEL transformations supported by Spring Data MongoDB:
.Supported SpEL transformations
[%header,cols="2"]
|===
| SpEL Expression
| Mongo Expression Part
| a == b
| { $eq : [$a, $b] }
| a != b
| { $ne : [$a , $b] }
| a > b
| { $gt : [$a, $b] }
| a >= b
| { $gte : [$a, $b] }
| a < b
| { $lt : [$a, $b] }
| a <= b
| { $lte : [$a, $b] }
| a + b
| { $add : [$a, $b] }
| a - b
| { $subtract : [$a, $b] }
| a * b
| { $multiply : [$a, $b] }
| a / b
| { $divide : [$a, $b] }
| a^b
| { $pow : [$a, $b] }
| a % b
| { $mod : [$a, $b] }
| a && b
| { $and : [$a, $b] }
| a \|\| b
| { $or : [$a, $b] }
| !a
| { $not : [$a] }
|===
In addition to the transformations shown in the preceding table, you can use standard SpEL operations such as `new` to (for example) create arrays and reference expressions through their names (followed by the arguments to use in brackets). The following example shows how to create an array in this fashion:
[source,java]
----
// { $setEquals : [$a, [5, 8, 13] ] }
.andExpression("setEquals(a, new int[]{5, 8, 13})");
----
[[mongo.aggregation.examples]]
==== Aggregation Framework Examples
The examples in this section demonstrate the usage patterns for the MongoDB Aggregation Framework with Spring Data MongoDB.
[[mongo.aggregation.examples.example1]]
===== Aggregation Framework Example 1
In this introductory example, we want to aggregate a list of tags to get the occurrence count of a particular tag from a MongoDB collection (called `tags`) sorted by the occurrence count in descending order. This example demonstrates the usage of grouping, sorting, projections (selection), and unwinding (result splitting).
[source,java]
----
class TagCount {
String tag;
int n;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
Aggregation agg = newAggregation(
project("tags"),
unwind("tags"),
group("tags").count().as("n"),
project("n").and("tag").previousOperation(),
sort(DESC, "n")
);
AggregationResults<TagCount> results = mongoTemplate.aggregate(agg, "tags", TagCount.class);
List<TagCount> tagCount = results.getMappedResults();
----
The preceding listing uses the following algorithm:
. Create a new aggregation by using the `newAggregation` static factory method, to which we pass a list of aggregation operations. These aggregate operations define the aggregation pipeline of our `Aggregation`.
. Use the `project` operation to select the `tags` field (which is an array of strings) from the input collection.
. Use the `unwind` operation to generate a new document for each tag within the `tags` array.
. Use the `group` operation to define a group for each `tags` value for which we aggregate the occurrence count (by using the `count` aggregation operator and collecting the result in a new field called `n`).
. Select the `n` field and create an alias for the ID field generated from the previous group operation (hence the call to `previousOperation()`) with a name of `tag`.
. Use the `sort` operation to sort the resulting list of tags by their occurrence count in descending order.
. Call the `aggregate` method on `MongoTemplate` to let MongoDB perform the actual aggregation operation, with the created `Aggregation` as an argument.
Note that the input collection is explicitly specified as the `tags` parameter to the `aggregate` Method. If the name of the input collection is not specified explicitly, it is derived from the input class passed as the first parameter to the `newAggreation` method.
[[mongo.aggregation.examples.example2]]
===== Aggregation Framework Example 2
This example is based on the https://docs.mongodb.org/manual/tutorial/aggregation-examples/#largest-and-smallest-cities-by-state[Largest and Smallest Cities by State] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return the smallest and largest cities by population for each state by using the aggregation framework. This example demonstrates grouping, sorting, and projections (selection).
[source,java]
----
class ZipInfo {
String id;
String city;
String state;
@Field("pop") int population;
@Field("loc") double[] location;
}
class City {
String name;
int population;
}
class ZipInfoStats {
String id;
String state;
City biggestCity;
City smallestCity;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
TypedAggregation<ZipInfo> aggregation = newAggregation(ZipInfo.class,
group("state", "city")
.sum("population").as("pop"),
sort(ASC, "pop", "state", "city"),
group("state")
.last("city").as("biggestCity")
.last("pop").as("biggestPop")
.first("city").as("smallestCity")
.first("pop").as("smallestPop"),
project()
.and("state").previousOperation()
.and("biggestCity")
.nested(bind("name", "biggestCity").and("population", "biggestPop"))
.and("smallestCity")
.nested(bind("name", "smallestCity").and("population", "smallestPop")),
sort(ASC, "state")
);
AggregationResults<ZipInfoStats> result = mongoTemplate.aggregate(aggregation, ZipInfoStats.class);
ZipInfoStats firstZipInfoStats = result.getMappedResults().get(0);
----
Note that the `ZipInfo` class maps the structure of the given input-collection. The `ZipInfoStats` class defines the structure in the desired output format.
The preceding listings use the following algorithm:
. Use the `group` operation to define a group from the input-collection. The grouping criteria is the combination of the `state` and `city` fields, which forms the ID structure of the group. We aggregate the value of the `population` property from the grouped elements by using the `sum` operator and save the result in the `pop` field.
. Use the `sort` operation to sort the intermediate-result by the `pop`, `state` and `city` fields, in ascending order, such that the smallest city is at the top and the biggest city is at the bottom of the result. Note that the sorting on `state` and `city` is implicitly performed against the group ID fields (which Spring Data MongoDB handled).
. Use a `group` operation again to group the intermediate result by `state`. Note that `state` again implicitly references a group ID field. We select the name and the population count of the biggest and smallest city with calls to the `last(…)` and `first(...)` operators, respectively, in the `project` operation.
. Select the `state` field from the previous `group` operation. Note that `state` again implicitly references a group ID field. Because we do not want an implicitly generated ID to appear, we exclude the ID from the previous operation by using `and(previousOperation()).exclude()`. Because we want to populate the nested `City` structures in our output class, we have to emit appropriate sub-documents by using the nested method.
. Sort the resulting list of `StateStats` by their state name in ascending order in the `sort` operation.
Note that we derive the name of the input collection from the `ZipInfo` class passed as the first parameter to the `newAggregation` method.
[[mongo.aggregation.examples.example3]]
===== Aggregation Framework Example 3
This example is based on the https://docs.mongodb.org/manual/tutorial/aggregation-examples/#states-with-populations-over-10-million[States with Populations Over 10 Million] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return all states with a population greater than 10 million, using the aggregation framework. This example demonstrates grouping, sorting, and matching (filtering).
[source,java]
----
class StateStats {

    @Id String id;
    String state;
    @Field("totalPop") int totalPopulation;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

TypedAggregation<ZipInfo> agg = newAggregation(ZipInfo.class,
    group("state").sum("population").as("totalPop"),
    sort(ASC, previousOperation(), "totalPop"),
    match(where("totalPop").gte(10 * 1000 * 1000))
);

AggregationResults<StateStats> result = mongoTemplate.aggregate(agg, StateStats.class);
List<StateStats> stateStatsList = result.getMappedResults();
----
The preceding listings use the following algorithm:

. Group the input collection by the `state` field, calculate the sum of the `population` field, and store the result in the new `totalPop` field.
. Sort the intermediate result by the ID reference of the previous `group` operation and by the `totalPop` field, in ascending order.
. Filter the intermediate result by using a `match` operation, which accepts a `Criteria` query as an argument.

Note that we derive the name of the input collection from the `ZipInfo` class passed as the first parameter to the `newAggregation` method.
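If you want to inspect the pipeline that Spring Data MongoDB renders from such a definition before running it, a quick debugging sketch (assuming `Aggregation#toString()` serializes the pipeline, whose exact formatting may vary between versions):

[source,java]
----
// Print the rendered aggregation pipeline for inspection. The output is a
// JSON-like representation of the pipeline stages sent to MongoDB.
System.out.println(agg);
----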
[[mongo.aggregation.examples.example4]]
===== Aggregation Framework Example 4
This example demonstrates the use of simple arithmetic operations in the projection operation.
[source,java]
----
class Product {

    String id;
    String name;
    double netPrice;
    int spaceUnits;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

TypedAggregation<Product> agg = newAggregation(Product.class,
    project("name", "netPrice")
        .and("netPrice").plus(1).as("netPricePlus1")
        .and("netPrice").minus(1).as("netPriceMinus1")
        .and("netPrice").multiply(1.19).as("grossPrice")
        .and("netPrice").divide(2).as("netPriceDiv2")
        .and("spaceUnits").mod(2).as("spaceUnitsMod2")
);

AggregationResults<Document> result = mongoTemplate.aggregate(agg, Document.class);
List<Document> resultList = result.getMappedResults();
----
Note that we derive the name of the input collection from the `Product` class passed as the first parameter to the `newAggregation` method.
[[mongo.aggregation.examples.example5]]
===== Aggregation Framework Example 5
This example demonstrates the use of simple arithmetic operations derived from SpEL expressions in the projection operation.
[source,java]
----
class Product {

    String id;
    String name;
    double netPrice;
    int spaceUnits;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

TypedAggregation<Product> agg = newAggregation(Product.class,
    project("name", "netPrice")
        .andExpression("netPrice + 1").as("netPricePlus1")
        .andExpression("netPrice - 1").as("netPriceMinus1")
        .andExpression("netPrice / 2").as("netPriceDiv2")
        .andExpression("netPrice * 1.19").as("grossPrice")
        .andExpression("spaceUnits % 2").as("spaceUnitsMod2")
        .andExpression("(netPrice * 0.8 + 1.2) * 1.19").as("grossPriceIncludingDiscountAndCharge")
);

AggregationResults<Document> result = mongoTemplate.aggregate(agg, Document.class);
List<Document> resultList = result.getMappedResults();
----
[[mongo.aggregation.examples.example6]]
===== Aggregation Framework Example 6
This example demonstrates the use of complex arithmetic operations derived from SpEL expressions in the projection operation.

NOTE: The additional parameters passed to the `andExpression` method can be referenced with indexer expressions according to their position. In this example, we reference the first parameter of the parameters array with `[0]`. When the SpEL expression is transformed into a MongoDB aggregation framework expression, external parameter expressions are replaced with their respective values.
[source,java]
----
class Product {

    String id;
    String name;
    double netPrice;
    int spaceUnits;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

double shippingCosts = 1.2;

TypedAggregation<Product> agg = newAggregation(Product.class,
    project("name", "netPrice")
        .andExpression("(netPrice * (1-discountRate) + [0]) * (1+taxRate)", shippingCosts).as("salesPrice")
);

AggregationResults<Document> result = mongoTemplate.aggregate(agg, Document.class);
List<Document> resultList = result.getMappedResults();
----
Note that we can also refer to other fields of the document within the SpEL expression.
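More than one external parameter can be passed this way, each referenced by its position in the parameters array (`[0]`, `[1]`, and so on). A sketch under that assumption — the `handlingFee` parameter is made up for illustration:

[source,java]
----
double shippingCosts = 1.2;
double handlingFee = 0.5; // hypothetical second external parameter

TypedAggregation<Product> agg = newAggregation(Product.class,
    project("name", "netPrice")
        // [0] resolves to shippingCosts, [1] to handlingFee
        .andExpression("(netPrice + [0] + [1]) * 1.19", shippingCosts, handlingFee).as("totalPrice")
);
----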
[[mongo.aggregation.examples.example7]]
===== Aggregation Framework Example 7
This example uses conditional projection. It is derived from the https://docs.mongodb.com/manual/reference/operator/aggregation/cond/[$cond reference documentation].
[source,java]
----
public class InventoryItem {

    @Id int id;
    String item;
    String description;
    int qty;
}

public class InventoryItemProjection {

    @Id int id;
    String item;
    String description;
    int qty;
    int discount;
}
----
[source,java]
----
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

TypedAggregation<InventoryItem> agg = newAggregation(InventoryItem.class,
    project("item").and("discount")
        .applyCondition(ConditionalOperators.Cond.newBuilder().when(Criteria.where("qty").gte(250))
            .then(30)
            .otherwise(20))
        .and(ifNull("description", "Unspecified")).as("description")
);

AggregationResults<InventoryItemProjection> result = mongoTemplate.aggregate(agg, "inventory", InventoryItemProjection.class);
List<InventoryItemProjection> inventoryList = result.getMappedResults();
----
This one-step aggregation uses a projection operation with the `inventory` collection. We project the `discount` field by using a conditional operation for all inventory items that have a `qty` greater than or equal to `250`. A second conditional projection is performed for the `description` field. We apply the `Unspecified` description to all items that either do not have a `description` field or have a `null` description.
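The same condition can also be expressed through the static `ConditionalOperators.when(…)` entry point (used in the next listing as well). A sketch of the equivalent projection fragment:

[source,java]
----
// Equivalent conditional projection using the static when(…) entry point.
ProjectionOperation projection = project("item").and("discount")
    .applyCondition(ConditionalOperators.when(Criteria.where("qty").gte(250))
        .then(30)
        .otherwise(20));
----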
As of MongoDB 3.6, it is possible to exclude fields from the projection by using a conditional expression.
.Conditional aggregation projection
====
[source,java]
----
TypedAggregation<Book> agg = Aggregation.newAggregation(Book.class,
    project("title")
        .and(ConditionalOperators.when(ComparisonOperators.valueOf("author.middle") <1>
                .equalToValue("")) <2>
            .then("$$REMOVE") <3>
            .otherwiseValueOf("author.middle") <4>
        )
        .as("author.middle"));
----
<1> If the value of the field `author.middle`
<2> does not contain a value,
<3> then use https://docs.mongodb.com/manual/reference/aggregation-variables/#variable.REMOVE[``$$REMOVE``] to exclude the field.
<4> Otherwise, add the field value of `author.middle`.
====
include::aggregation-framework.adoc[]
[[mongo-template.index-and-collections]]
== Index and Collection Management
[[gridfs]]
== GridFS Support
MongoDB supports storing binary files inside its filesystem, GridFS. Spring Data MongoDB provides a `GridFsOperations` interface as well as the corresponding implementation, `GridFsTemplate`, to let you interact with the filesystem. You can set up a `GridFsTemplate` instance by handing it a `MongoDatabaseFactory` as well as a `MongoConverter`, as the following example shows:
.JavaConfig setup for a GridFsTemplate
====
[source,java]
----
class GridFsConfiguration extends AbstractMongoClientConfiguration {

    // … further configuration omitted

    @Bean
    public GridFsTemplate gridFsTemplate() {
        return new GridFsTemplate(mongoDbFactory(), mappingMongoConverter());
    }
}
----
====
The corresponding XML configuration follows:
.XML configuration for a GridFsTemplate
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:mongo="http://www.springframework.org/schema/data/mongo"
       xsi:schemaLocation="http://www.springframework.org/schema/data/mongo
                           https://www.springframework.org/schema/data/mongo/spring-mongo.xsd
                           http://www.springframework.org/schema/beans
                           https://www.springframework.org/schema/beans/spring-beans.xsd">

    <mongo:db-factory id="mongoDbFactory" dbname="database" />
    <mongo:mapping-converter id="converter" />

    <bean class="org.springframework.data.mongodb.gridfs.GridFsTemplate">
        <constructor-arg ref="mongoDbFactory" />
        <constructor-arg ref="converter" />
    </bean>

</beans>
----
====
The template can now be injected and used to perform storage and retrieval operations, as the following example shows:
.Using GridFsTemplate to store files
====
[source,java]
----
class GridFsClient {

    @Autowired
    GridFsOperations operations;

    @Test
    public void storeFileToGridFs() {

        FileMetadata metadata = new FileMetadata();
        // populate metadata
        Resource file = … // lookup File or Resource

        operations.store(file.getInputStream(), "filename.txt", metadata);
    }
}
----
====
The `store(…)` operations take an `InputStream`, a filename, and (optionally) metadata information about the file to store. The metadata can be an arbitrary object, which will be marshaled by the `MongoConverter` configured with the `GridFsTemplate`. Alternatively, you can also provide a `Document`.
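For example, a plain `Document` can carry the metadata. A minimal sketch — the metadata keys shown are made up for illustration:

[source,java]
----
import org.bson.Document;

// Arbitrary metadata stored alongside the file (hypothetical keys).
Document metadata = new Document("contentCategory", "invoice").append("customerId", 42);

operations.store(file.getInputStream(), "invoice-42.txt", metadata);
----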
You can read files from the filesystem through either the `find(…)` or the `getResources(…)` methods. Let's have a look at the `find(…)` methods first. You can either find a single file or multiple files that match a `Query`. You can use the `GridFsCriteria` helper class to define queries. It provides static factory methods to encapsulate default metadata fields (such as `whereFilename()` and `whereContentType()`) or a custom one through `whereMetaData()`. The following example shows how to use `GridFsTemplate` to query for files:
.Using GridFsTemplate to query for files
====
[source,java]
----
class GridFsClient {

    @Autowired
    GridFsOperations operations;

    @Test
    public void findFilesInGridFs() {
        GridFSFindIterable result = operations.find(query(whereFilename().is("filename.txt")));
    }
}
====
NOTE: Currently, MongoDB does not support defining sort criteria when retrieving files from GridFS. For this reason, any sort criteria defined on the `Query` instance handed into the `find(…)` method are disregarded.
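Custom metadata can be queried through `whereMetaData(…)` as well. A sketch, assuming a `customerId` key was stored with the file as in the earlier metadata example:

[source,java]
----
// Find all files whose metadata contains customerId = 42 (hypothetical key).
GridFSFindIterable files = operations.find(query(whereMetaData("customerId").is(42)));
----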
The other option to read files from GridFS is to use the methods introduced by the `ResourcePatternResolver` interface. They let you hand an Ant-style path into the method and can thus retrieve files matching the given pattern. The following example shows how to use `GridFsTemplate` to read files:
.Using GridFsTemplate to read files
====
[source,java]
----
class GridFsClient {

    @Autowired
    GridFsOperations operations;

    @Test
    public void readFilesFromGridFs() {
        GridFsResource[] txtFiles = operations.getResources("*.txt");
    }
}
====
`GridFsOperations` extends `ResourcePatternResolver` and lets the `GridFsTemplate` (for example) be plugged into an `ApplicationContext` to read Spring configuration files from a MongoDB database.
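A short sketch of reading a single stored file back through the `Resource` abstraction — the filename is made up:

[source,java]
----
import java.io.InputStream;

GridFsResource resource = operations.getResource("application.properties"); // hypothetical filename

try (InputStream in = resource.getInputStream()) {
    // consume the file content, for example by loading it into a java.util.Properties instance
}
----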
include::gridfs.adoc[]
include::tailable-cursors.adoc[]
include::change-streams.adoc[]
Spring Data MongoDB Changelog
=============================
Changes in version 3.2.3 (2021-07-16)
-------------------------------------
* #3702 - `MappingMongoConverter` incorrectly processes an object property of type `org.bson.Document`.
* #3689 - Fix Regression in generating queries with nested maps with numeric keys.
* #3688 - Multiple maps with numeric keys in a single update produces the wrong query (Regression).
* #3686 - reading a document with a list with a null element fails with Spring Data Mongo 3.2.2, works with 3.2.1.
* #3684 - Add equals and hashcode to UnwrappedMongoPersistentProperty (fixes #3683).
* #3683 - Memory Leak: instances of UnwrappedMongoPersistentProperty are accumulating in PreferredConstructor.isPropertyParameterCache.
* #3670 - `Binary` not deserialized to `byte[]` for property of type `Object`.
Changes in version 3.2.2 (2021-06-22)
-------------------------------------
* #3677 - Add missing double quote to GeoJson.java JSDoc header.
* #3668 - Projection on the _id field returns wrong result when using `@MongoId` (MongoDB 4.4).
* #3666 - Documentation references outdated `Mongo` client.
* #3660 - MappingMongoConverter problem: ConversionContext#convert does not try to use custom converters first.
* #3659 - [3.2.1] Indexing Class with Custom Converter -> Couldn't find PersistentEntity for property private [...].
* #3635 - $floor isOrOrNor() return true.
* #3633 - NPE in QueryMapper when use Query with `null` as value.
Changes in version 3.1.10 (2021-06-22)
--------------------------------------
* #3677 - Add missing double quote to GeoJson.java JSDoc header.
* #3666 - Documentation references outdated `Mongo` client.
* #3659 - [3.2.1] Indexing Class with Custom Converter -> Couldn't find PersistentEntity for property private [...].
* #3635 - $floor isOrOrNor() return true.
* #3633 - NPE in QueryMapper when use Query with `null` as value.
Changes in version 3.2.1 (2021-05-14)
-------------------------------------
* #3638 - Introduce template method for easier customization of fragments.
* #3632 - Fix bullet points in aggregations framework asciidoc.
Changes in version 3.1.9 (2021-05-14)
-------------------------------------
Changes in version 3.2.0 (2021-04-14)
-------------------------------------
* #3623 - `@Aggregation` repository query method causes `NullPointerException` when the result is empty.
Spring Data MongoDB 3.2.6 (2021.0.6)
Copyright (c) [2010-2019] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").