Compare commits

70 Commits
3.1.3 ... 3.1.9

Author SHA1 Message Date
Mark Paluch
9ef1386784 Release version 3.1.9 (2020.0.9).
See #3628
2021-05-14 11:52:23 +02:00
Mark Paluch
696fd725c3 Prepare 3.1.9 (2020.0.9).
See #3628
2021-05-14 11:51:53 +02:00
Mark Paluch
e6fda2ccdd Updated changelog.
See #3628
2021-05-14 11:51:46 +02:00
Greg L. Turnquist
7a24bab9a2 Authenticate with artifactory.
See #3616.
2021-04-22 15:04:01 -05:00
Mark Paluch
38b7fb7105 Updated changelog.
See #3616
2021-04-14 14:40:02 +02:00
Mark Paluch
d42d06e058 After release cleanups.
See #3617
2021-04-14 11:42:08 +02:00
Mark Paluch
e2709abfe0 Prepare next development iteration.
See #3617
2021-04-14 11:42:04 +02:00
Mark Paluch
12b4aab834 Release version 3.1.8 (2020.0.8).
See #3617
2021-04-14 11:33:25 +02:00
Mark Paluch
db06756c8f Prepare 3.1.8 (2020.0.8).
See #3617
2021-04-14 11:32:47 +02:00
Mark Paluch
b319b8a589 Updated changelog.
See #3617
2021-04-14 11:32:42 +02:00
Mark Paluch
a516795759 Updated changelog.
See #3597
2021-04-14 11:17:41 +02:00
Mark Paluch
bab08502a5 Polishing.
Fix nullability annotations for isEqual(…) parameters. Fix generics. Reformat code.

Add tests.

See #3414
Original pull request: #3615.
2021-04-13 09:40:00 +02:00
Clement Petit
3e1f95bc94 Handle nested Pattern and Document in Criteria.equals(…).
Closes #3414
Original pull request: #3615.
2021-04-13 09:39:59 +02:00
Mark Paluch
5c153dc76e Polishing.
Use ObjectUtils for empty check.

See #3623
Original pull request: #3625.
2021-04-13 09:09:42 +02:00
Christoph Strobl
8f4e207d97 Fix NPE in declarative aggregation execution.
This commit fixes an issue where using a simple return type leads to an NPE when the actual aggregation result does not contain any values.

Closes: #3623
Original pull request: #3625.
2021-04-13 09:09:42 +02:00
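
For illustration, a declarative aggregation with a simple return type looks roughly like the sketch below; the Order type, collection, and pipeline are hypothetical. With this fix the method returns null instead of throwing when the pipeline yields no document.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.Aggregation;
import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical domain type.
class Order {
    @Id String id;
    double total;
}

interface OrderRepository extends MongoRepository<Order, String> {

    // Simple return type for a single-value aggregation result; with the fix this
    // yields null rather than an NPE when no orders match.
    @Aggregation("{ $group : { _id : null, total : { $sum : '$total' } } }")
    Long sumTotal();
}
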
Christoph Strobl
5000a40d72 Fix query mapping resolution of properties using underscore within field name.
Closes: #3601
Original pull request: #3607.
2021-04-09 12:27:08 +02:00
Mark Paluch
fb59f49dae After release cleanups.
See #3598
2021-03-31 18:29:41 +02:00
Mark Paluch
f3c1e014e9 Prepare next development iteration.
See #3598
2021-03-31 18:29:38 +02:00
Mark Paluch
f52cc3be1f Release version 3.1.7 (2020.0.7).
See #3598
2021-03-31 18:19:52 +02:00
Mark Paluch
1bda93858c Prepare 3.1.7 (2020.0.7).
See #3598
2021-03-31 18:19:19 +02:00
Mark Paluch
1808970daf Updated changelog.
See #3598
2021-03-31 18:19:13 +02:00
Mark Paluch
558fc28cce Updated changelog.
See #3595
2021-03-31 17:26:09 +02:00
Mark Paluch
16bef54f11 Use StringUtils.replace(…) instead of String.replaceAll(…) for mapKeyDotReplacement.
We now use StringUtils.replace(…) to replace the map key dot in MappingMongoConverter. StringUtils performs a plain search instead of using a regular expression, which improves the overall performance.

Closes #3613
2021-03-30 14:29:50 +02:00
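
A rough illustration of the difference, using an arbitrary replacement token: String.replaceAll(…) treats its first argument as a regular expression (compiled on every call), while StringUtils.replace(…) performs a literal search and replace.

import org.springframework.util.StringUtils;

public class MapKeyDotReplacementExample {

    public static void main(String[] args) {

        String key = "settings.ui.theme";
        String replacement = "~"; // arbitrary token, e.g. configured via setMapKeyDotReplacement(…)

        // Regex based: "\\." is compiled into a Pattern on every invocation.
        String viaRegex = key.replaceAll("\\.", replacement);

        // Literal search and replace, no regex machinery involved.
        String viaStringUtils = StringUtils.replace(key, ".", replacement);

        System.out.println(viaRegex);        // settings~ui~theme
        System.out.println(viaStringUtils);  // settings~ui~theme
    }
}
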
Mark Paluch
d68a812e1b Polishing.
Omit StreamUtils usage if input is a collection. Remove superfluous Flux.from(…). Simplify test and migrate test to JUnit 5.

See #3609.
Original pull request: #3611.
2021-03-29 11:02:34 +02:00
Clément Petit
ccb9f111d9 Return saved entity reference instead of original reference.
Make SimpleReactiveMongoRepository#saveAll(Publisher<S>) return the saved entity references instead of the original references.

Closes #3609
Original pull request: #3611.
2021-03-29 10:55:36 +02:00
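
In practice this means the Flux returned by saveAll(Publisher<S>) emits the persisted entities, so state assigned during the save (such as a generated id) is visible downstream. The Person type and repository below are hypothetical.

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.ReactiveMongoRepository;

import reactor.core.publisher.Flux;

// Hypothetical domain type with a generated id.
class Person {
    @Id String id;
    String name;

    Person(String name) {
        this.name = name;
    }
}

interface PersonRepository extends ReactiveMongoRepository<Person, String> {}

class SaveAllExample {

    void saveAndUseGeneratedIds(PersonRepository repository) {

        Flux<Person> input = Flux.just(new Person("Ada"), new Person("Linus"));

        repository.saveAll(input)          // emits the saved entities ...
                .map(person -> person.id)  // ... so generated ids are available downstream
                .subscribe(System.out::println);
    }
}
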
Mark Paluch
f64b177c8f Updated changelog.
See #3558
2021-03-17 11:31:32 +01:00
Mark Paluch
c0c7ba767f After release cleanups.
See #3561
2021-03-17 11:02:15 +01:00
Mark Paluch
7639701f3f Prepare next development iteration.
See #3561
2021-03-17 11:02:13 +01:00
Mark Paluch
b39b2591b6 Release version 3.1.6 (2020.0.6).
See #3561
2021-03-17 10:54:12 +01:00
Mark Paluch
65c8317e38 Prepare 3.1.6 (2020.0.6).
See #3561
2021-03-17 10:53:43 +01:00
Mark Paluch
9d0f7bac6a Updated changelog.
See #3561
2021-03-17 10:53:41 +01:00
Mark Paluch
6f50747d21 Updated changelog.
See #3556
2021-03-17 10:35:15 +01:00
Mark Paluch
5cf1578ad3 Polishing.
Move hasValue(…) from DocumentAccessor to BsonUtils. Fix typo in tests.

See: #3590
Original pull request: #3591.
2021-03-15 14:03:33 +01:00
Christoph Strobl
78a59c45ca Fix ShardKey lookup for nested paths.
This commit fixes the lookup of shard key values for nested paths using the dot (.) notation.

Closes: #3590
Original pull request: #3591.
2021-03-15 14:02:49 +01:00
Christoph Strobl
dccdfc8b4d Upgrade MongoDB drivers to 4.1.2
Closes #3589
2021-03-15 09:17:52 +01:00
Mark Paluch
e48239eb8f Remove @Persistent from entity-scan include filters.
We now only scan for entities annotated with `@Document` to avoid inclusion of non-MongoDB entities. Previously, types annotated (or meta-annotated) with `@Persistent` were included as MongoDB entities, which could lead to mapping rule violations.

Closes #3592
2021-03-11 15:08:05 +01:00
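
Concretely, only types carrying (or meta-annotated with) the MongoDB-specific @Document annotation are picked up by entity scanning now; a minimal, hypothetical example:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// Picked up by entity scanning: annotated with Spring Data MongoDB's @Document.
@Document("customers")
class Customer {

    @Id String id;
    String name;
}

// A type annotated only with org.springframework.data.annotation.Persistent
// (or a meta-annotation of it) is no longer considered a MongoDB entity.
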
Christoph Strobl
c3b4f61d29 Preserve class keyword as Map key during update mapping.
This commit makes sure to skip the class property of Object when mapping maps and their keys inside an Update.

Closes #3566
Original pull request: #3577.
2021-03-02 11:38:20 +01:00
Mark Paluch
22ed860b4a Polishing.
Reformat code. Reduce method visibility in JUnit 5 tests. Add Nullable annotations to address warnings.

See #3568
Original pull request: #3569.
2021-03-02 11:30:26 +01:00
Brice Vandeputte
bf642ad3f7 Translate MongoSocketException subclasses to DataAccessResourceFailureException.
Closes #3568
Original pull request: #3569.
2021-03-02 11:30:26 +01:00
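
A hedged sketch of the effect: callers working against Spring's DataAccessException hierarchy now see socket-level driver failures as DataAccessResourceFailureException instead of an untranslated MongoException. The helper below is illustrative only.

import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Query;

class ResilientLookup {

    // Any MongoSocketException raised by the driver surfaces as a
    // DataAccessResourceFailureException after exception translation.
    long countOrZero(MongoTemplate template, String collection) {
        try {
            return template.count(new Query(), collection);
        } catch (DataAccessResourceFailureException e) {
            return 0L; // connectivity problem (e.g. broken socket) - degrade gracefully
        }
    }
}
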
Christoph Strobl
fcd48539ea Remove duplicate JSON Schema section from reference documentation.
Closes: #3573
Original pull request: #3574.
2021-03-01 14:42:33 +01:00
Mark Paluch
bf10f72a57 Polishing.
Simplify assertions.

See #3552.
Original pull request: #3565.
2021-02-22 09:57:03 +01:00
Christoph Strobl
1c652cce1c Preserve numeric keys that exceed Long range when mapping Updates.
This commit makes sure numeric map keys are preserved even when they exceed the Long value range.

Closes #3552.
Original pull request: #3565.
2021-02-22 09:57:02 +01:00
Mark Paluch
dc2de878bc Polishing.
Reformat code. Add since tags.

See #3395
Original pull request: #3554.
2021-02-18 15:08:49 +01:00
Christoph Strobl
00cacc02ac Fix case insensitive derived in queries on String properties.
We now consider the IgnoreCase part of a derived query when used along with In. Strings will be quoted to avoid malicious strings from being handed over to the server as a regular expression to evaluate.

See #3395
Original pull request: #3554.
2021-02-18 15:08:49 +01:00
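
As an illustration (repository and domain type are hypothetical), a derived query combining In with IgnoreCase now matches regardless of case, with each value regex-quoted before it reaches the server:

import java.util.Collection;
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical domain type with a String lastName property.
class Person {
    @Id String id;
    String lastName;
}

interface PersonRepository extends MongoRepository<Person, String> {

    // Each value is quoted (Pattern.quote) and matched case-insensitively, so
    // user-supplied strings cannot be evaluated as regular expressions.
    List<Person> findByLastNameInIgnoreCase(Collection<String> lastNames);
}
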
Christoph Strobl
811c2e5d7b Updated changelog.
See #3560
2021-02-18 11:37:47 +01:00
Christoph Strobl
200f3006bd After release cleanups.
See #3557
2021-02-18 11:12:46 +01:00
Christoph Strobl
1d6bea51ec Prepare next development iteration.
See #3557
2021-02-18 11:12:44 +01:00
Christoph Strobl
7779ded45c Release version 3.1.5 (2020.0.5).
See #3557
2021-02-18 10:59:16 +01:00
Christoph Strobl
918bf7c138 Prepare 3.1.5 (2020.0.5).
See #3557
2021-02-18 10:58:49 +01:00
Christoph Strobl
abe3b9f6d7 Updated changelog.
See #3557
2021-02-18 10:58:42 +01:00
Christoph Strobl
41c453cc83 Updated changelog.
See #3537
2021-02-17 14:20:39 +01:00
Christoph Strobl
77784d88c7 After release cleanups.
See #3536
2021-02-17 13:41:54 +01:00
Christoph Strobl
263c62c880 Prepare next development iteration.
See #3536
2021-02-17 13:41:53 +01:00
Christoph Strobl
24ab8f67bb Release version 3.1.4 (2020.0.4).
See #3536
2021-02-17 12:00:24 +01:00
Christoph Strobl
572ceb867e Prepare 3.1.4 (2020.0.4).
See #3536
2021-02-17 11:59:37 +01:00
Christoph Strobl
b7caea8602 Updated changelog.
See #3536
2021-02-17 11:59:32 +01:00
Christoph Strobl
3696f2144f Updated changelog.
See #3520
2021-02-17 11:34:25 +01:00
Christoph Strobl
b25c8acca6 Updated changelog.
See #3519
2021-02-17 10:58:24 +01:00
Christoph Strobl
00d6271468 Fix DocumentToStringConverter UUID representation when calling toJson.
This commit makes sure to use an Encoder having UuidRepresentation set when calling org.bson.Document#toJson, preventing CodecConfigurationException from being raised.

Future versions will make sure the UUID string representation matches the Java default one.

Closes #3546.
Original pull request: #3551.
2021-02-17 07:47:49 +01:00
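
The underlying mechanics, mirroring the change to DocumentToStringConverter: obtain a Codec<Document> from the default registry with an explicit UuidRepresentation and pass it to Document#toJson, so documents containing UUID values no longer trigger a CodecConfigurationException. The class below is an illustrative sketch, not the converter itself.

import java.util.UUID;

import org.bson.Document;
import org.bson.UuidRepresentation;
import org.bson.codecs.Codec;
import org.bson.internal.CodecRegistryHelper;

import com.mongodb.MongoClientSettings;

class DocumentToJsonExample {

    // Codec that knows how to render UUID values; calling Document#toJson() without
    // it can raise a CodecConfigurationException for documents containing UUIDs.
    private static final Codec<Document> CODEC = CodecRegistryHelper
            .createRegistry(MongoClientSettings.getDefaultCodecRegistry(), UuidRepresentation.JAVA_LEGACY)
            .get(Document.class);

    static String toJson(Document source) {
        return source.toJson(CODEC);
    }

    public static void main(String[] args) {
        System.out.println(toJson(new Document("id", UUID.randomUUID())));
    }
}
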
Christoph Strobl
bb603ba7b9 Updated reference documentation regarding GeoJsonModule.
Original pull request: #3539.
Closes #3517
2021-02-02 14:50:54 +01:00
Christoph Strobl
02eaa4cbd2 Allow access to mongoDatabaseFactory used in ReactiveMongoTemplate.
By offering a getter method for the ReactiveMongoDatabaseFactory, users subclassing ReactiveMongoTemplate can evaluate the current transaction state via ReactiveMongoDatabaseUtils.isTransactionActive(getMongoDatabaseFactory()).
This change also aligns the reactive and imperative template implementation in that regard.

Closes #3540
Original pull request: #3541.
2021-02-02 14:22:15 +01:00
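
A hedged sketch of what the new accessor enables; the subclass and its use are illustrative only:

import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoDatabaseUtils;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;

import reactor.core.publisher.Mono;

// Illustrative subclass: the new getter exposes the factory so the template's
// transaction state can be inspected, mirroring the imperative MongoTemplate.
class AuditingReactiveMongoTemplate extends ReactiveMongoTemplate {

    AuditingReactiveMongoTemplate(ReactiveMongoDatabaseFactory factory) {
        super(factory);
    }

    Mono<Boolean> runningInTransaction() {
        return ReactiveMongoDatabaseUtils.isTransactionActive(getMongoDatabaseFactory());
    }
}
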
Christoph Strobl
7429503c63 Update count vs. estimatedCount documentation.
Closes #3055
Original pull request: #3541.
2021-02-02 14:22:15 +01:00
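
The documented trade-off in a short sketch (the Person type is hypothetical): an empty Query still runs an exact countDocuments-based count, while estimatedCount reads collection statistics.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class CountExamples {

    // Hypothetical domain type.
    static class Person {}

    void compareCounts(MongoOperations operations) {

        // Exact count: shard, session and transaction aware, but runs an aggregation
        // even when the query is empty.
        long exact = operations.count(new Query(), Person.class);

        // Estimated count: based on collection statistics - fast, but subject to the
        // limitations documented for e.g. sharded clusters and transactions.
        long estimated = operations.estimatedCount(Person.class);

        System.out.printf("exact=%d estimated=%d%n", exact, estimated);
    }
}
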
Christoph Strobl
82f4e2276b Fix Criteria chaining for Criteria.alike().
This commit fixes an issue where an Example probe would not be added to the criteria chain.

Closes #3544
Original pull request: #3549.
2021-02-01 09:14:52 +01:00
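
With the fix, alike(…) participates in the criteria chain like any other operator; a hypothetical usage:

import org.springframework.data.domain.Example;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class AlikeExample {

    // Hypothetical probe type.
    static class Person {
        String lastName;
        Person(String lastName) { this.lastName = lastName; }
    }

    Query byExample() {
        Person probe = new Person("Matthews");

        // Before the fix the $example criterion could get lost when alike(…) was
        // called on a Criteria without a key; it is now registered in the chain.
        Criteria criteria = new Criteria().alike(Example.of(probe));

        return new Query(criteria);
    }
}
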
Mark Paluch
e1bce7d942 Polishing.
Align type variable naming with imperative extensions (I, O). Add extension without accepting KClass. Update since tags and tests.

See #3508.
Original pull request: #893.
2021-01-27 09:57:39 +01:00
wonwoo
8bf3d395be Add missing ReactiveMongoOperations.aggregate Kotlin extension.
See #3508.
Original pull request: #893.
2021-01-27 09:57:39 +01:00
Christoph Strobl
d3c00a93c0 Update QBE Documentation section.
This commit adds a note explaining scenarios suitable for an UntypedExampleMatcher.

Closes: #3474
Original pull request: #3538.
2021-01-26 15:04:10 +01:00
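
The gist of that documentation note in a short sketch (the probe type is hypothetical): a default Example restricts matches to types assignable to the probe via the _class field, while UntypedExampleMatcher drops that restriction.

import org.springframework.data.domain.Example;
import org.springframework.data.mongodb.core.query.UntypedExampleMatcher;

class QueryByExampleMatching {

    // Hypothetical probe type.
    static class Person {
        String lastName = "Matthews";
    }

    void buildExamples() {
        Person probe = new Person();

        // Typed matching (default): the resulting query also restricts _class,
        // e.g. _class : { $in : [com.acme.Person] }.
        Example<Person> typed = Example.of(probe);

        // Untyped matching: no _class restriction, useful when documents of
        // different (but structurally similar) types live in one collection.
        Example<Person> untyped = Example.of(probe, UntypedExampleMatcher.matching());
    }
}
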
Christoph Strobl
0aa805e1a2 Fix method names in full text query documentation.
Closes #3525
2021-01-20 08:29:52 +01:00
Christoph Strobl
9dc1df3deb Updated changelog.
See #3521
2021-01-13 15:49:50 +01:00
Christoph Strobl
92a73a5cc0 After release cleanups.
See #3477
2021-01-13 15:01:50 +01:00
Christoph Strobl
910d66afb0 Prepare next development iteration.
See #3477
2021-01-13 15:01:46 +01:00
46 changed files with 1365 additions and 436 deletions

24
Jenkinsfile vendored

@@ -76,6 +76,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -85,7 +88,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -105,6 +108,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -114,7 +120,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -126,6 +132,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -135,7 +144,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -147,6 +156,9 @@ pipeline {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
@@ -156,7 +168,7 @@ pipeline {
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pjava11 clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pjava11 clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -185,7 +197,7 @@ pipeline {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-v $HOME:/tmp/jenkins-home') {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
@@ -216,7 +228,7 @@ pipeline {
script {
docker.withRegistry('', 'hub.docker.com-springbuildmaster') {
docker.image('adoptopenjdk/openjdk8:latest').inside('-v $HOME:/tmp/jenkins-home') {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,distribute ' +
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,distribute ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +

202
LICENSE.txt Normal file

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
https://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright {yyyy} {name of copyright owner}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

13
pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.3</version>
<version>3.1.9</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.4.3</version>
<version>2.4.9</version>
</parent>
<modules>
@@ -26,8 +26,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.4.3</springdata.commons>
<mongo>4.1.1</mongo>
<springdata.commons>2.4.9</springdata.commons>
<mongo>4.1.2</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -158,11 +158,6 @@
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</pluginRepository>
<pluginRepository>
<id>bintray-plugins</id>
<name>bintray-plugins</name>
<url>https://jcenter.bintray.com</url>
</pluginRepository>
</pluginRepositories>
</project>

29
settings.xml Normal file

@@ -0,0 +1,29 @@
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
https://maven.apache.org/xsd/settings-1.0.0.xsd">
<servers>
<server>
<id>spring-plugins-release</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-snapshot</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-milestone</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
<server>
<id>spring-libs-release</id>
<username>${env.ARTIFACTORY_USR}</username>
<password>${env.ARTIFACTORY_PSW}</password>
</server>
</servers>
</settings>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.3</version>
<version>3.1.9</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.3</version>
<version>3.1.9</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.3</version>
<version>3.1.9</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -26,7 +26,6 @@ import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
@@ -140,8 +139,7 @@ public abstract class MongoConfigurationSupport {
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document}.
*
* @param basePackage must not be {@literal null}.
* @return
@@ -161,7 +159,6 @@ public abstract class MongoConfigurationSupport {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {

View File

@@ -125,6 +125,11 @@ public interface ExecutableFindOperation {
/**
* Get the number of matching elements.
* <p />
* This method uses an {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions) aggregation
* execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees shard,
* session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link MongoOperations#estimatedCount(String)} for empty queries instead.
*
* @return total number of matching elements.
*/

View File

@@ -21,6 +21,7 @@ import java.util.HashSet;
import java.util.Set;
import org.bson.BsonInvalidOperationException;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
@@ -39,6 +40,7 @@ import org.springframework.util.ClassUtils;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.MongoException;
import com.mongodb.MongoServerException;
import com.mongodb.MongoSocketException;
import com.mongodb.bulk.BulkWriteError;
/**
@@ -49,6 +51,7 @@ import com.mongodb.bulk.BulkWriteError;
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl
* @author Brice Vandeputte
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
@@ -78,6 +81,10 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
}
if (ex instanceof MongoSocketException) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DUPLICATE_KEY_EXCEPTIONS.contains(exception)) {

View File

@@ -1160,6 +1160,12 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(Class)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -1176,6 +1182,12 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
@@ -1187,6 +1199,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return the estimated number of documents.
@@ -1200,6 +1215,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return the estimated number of documents.
@@ -1214,6 +1232,12 @@ public interface MongoOperations extends FluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.

View File

@@ -28,7 +28,6 @@ import org.bson.Document;
import org.bson.conversions.Bson;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
@@ -3454,7 +3453,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
}
/**
* @deprecated since 3.1.4. Use {@link #getMongoDatabaseFactory()} instead.
* @return the {@link MongoDatabaseFactory} in use.
*/
@Deprecated
public MongoDatabaseFactory getMongoDbFactory() {
return getMongoDatabaseFactory();
}
/**
* @return the {@link MongoDatabaseFactory} in use.
* @since 3.1.4
*/
public MongoDatabaseFactory getMongoDatabaseFactory() {
return mongoDbFactory;
}

View File

@@ -658,7 +658,8 @@ class QueryOperations {
: mappedDocument != null ? mappedDocument.getDocument() : getMappedUpdate(domainType);
Document filterWithShardKey = new Document(filter);
getMappedShardKeyFields(domainType).forEach(key -> filterWithShardKey.putIfAbsent(key, shardKeySource.get(key)));
getMappedShardKeyFields(domainType)
.forEach(key -> filterWithShardKey.putIfAbsent(key, BsonUtils.resolveValue(shardKeySource, key)));
return filterWithShardKey;
}

View File

@@ -106,6 +106,12 @@ public interface ReactiveFindOperation {
/**
* Get the number of matching elements.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but
* guarantees shard, session and transaction compliance. In case an inaccurate count satisfies the applications
* needs use {@link ReactiveMongoOperations#estimatedCount(String)} for empty queries instead.
*
* @return {@link Mono} emitting total number of matching elements. Never {@literal null}.
*/

View File

@@ -940,6 +940,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(Class)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -956,6 +962,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
@@ -971,6 +983,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches.
* <p />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
* shard, session and transaction compliance. In case an inaccurate count satisfies the applications needs use
* {@link #estimatedCount(String)} for empty queries instead.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
@@ -983,6 +1001,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
@@ -996,6 +1017,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <p />
* Please make sure to read the MongoDB reference documentation about limitations on eg. sharded cluster or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.

View File

@@ -2730,6 +2730,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return potentiallyForceAcknowledgedWrite(wc);
}
/**
* @return the {@link MongoDatabaseFactory} in use.
* @since 3.1.4
*/
public ReactiveMongoDatabaseFactory getMongoDatabaseFactory() {
return mongoDatabaseFactory;
}
@Nullable
private WriteConcern potentiallyForceAcknowledgedWrite(@Nullable WriteConcern wc) {

View File

@@ -17,17 +17,16 @@ package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.Iterator;
import java.util.Map;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
@@ -110,28 +109,7 @@ class DocumentAccessor {
*/
@Nullable
public Object get(MongoPersistentProperty property) {
String fieldName = property.getFieldName();
Map<String, Object> map = BsonUtils.asMap(document);
if (!fieldName.contains(".")) {
return map.get(fieldName);
}
Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
Map<String, Object> source = map;
Object result = null;
while (source != null && parts.hasNext()) {
result = source.get(parts.next());
if (parts.hasNext()) {
source = getAsMap(result);
}
}
return result;
return BsonUtils.resolveValue(document, property.getFieldName());
}
/**
@@ -157,71 +135,7 @@ class DocumentAccessor {
Assert.notNull(property, "Property must not be null!");
String fieldName = property.getFieldName();
if (this.document instanceof Document) {
if (((Document) this.document).containsKey(fieldName)) {
return true;
}
} else if (this.document instanceof DBObject) {
if (((DBObject) this.document).containsField(fieldName)) {
return true;
}
}
if (!fieldName.contains(".")) {
return false;
}
String[] parts = fieldName.split("\\.");
Map<String, Object> source;
if (this.document instanceof Document) {
source = ((Document) this.document);
} else {
source = ((DBObject) this.document).toMap();
}
Object result = null;
for (int i = 1; i < parts.length; i++) {
result = source.get(parts[i - 1]);
source = getAsMap(result);
if (source == null) {
return false;
}
}
return source.containsKey(parts[parts.length - 1]);
}
/**
* Returns the given source object as map, i.e. {@link Document}s and maps as is or {@literal null} otherwise.
*
* @param source can be {@literal null}.
* @return can be {@literal null}.
*/
@Nullable
@SuppressWarnings("unchecked")
private static Map<String, Object> getAsMap(Object source) {
if (source instanceof Document) {
return (Document) source;
}
if (source instanceof BasicDBObject) {
return (BasicDBObject) source;
}
if (source instanceof Map) {
return (Map<String, Object>) source;
}
return null;
return BsonUtils.hasValue(document, property.getFieldName());
}
/**

View File

@@ -75,6 +75,7 @@ import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -182,6 +183,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* any translation but rather reject a {@link Map} with keys containing dots causing the conversion for the entire
* object to fail. If further customization of the translation is needed, have a look at
* {@link #potentiallyEscapeMapKey(String)} as well as {@link #potentiallyUnescapeMapKey(String)}.
* <p>
* {@code mapKeyDotReplacement} is used as-is during replacement operations without further processing (i.e. regex or
* normalization).
*
* @param mapKeyDotReplacement the mapKeyDotReplacement to set. Can be {@literal null}.
*/
@@ -900,7 +904,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
source));
}
return source.replaceAll("\\.", mapKeyDotReplacement);
return StringUtils.replace(source, ".", mapKeyDotReplacement);
}
/**
@@ -928,7 +932,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @return
*/
protected String potentiallyUnescapeMapKey(String source) {
return mapKeyDotReplacement == null ? source : source.replaceAll(mapKeyDotReplacement, "\\.");
return mapKeyDotReplacement == null ? source : StringUtils.replace(source, mapKeyDotReplacement, ".");
}
/**

View File

@@ -32,6 +32,9 @@ import java.util.concurrent.atomic.AtomicLong;
import org.bson.BsonTimestamp;
import org.bson.Document;
import org.bson.UuidRepresentation;
import org.bson.codecs.Codec;
import org.bson.internal.CodecRegistryHelper;
import org.bson.types.Binary;
import org.bson.types.Code;
import org.bson.types.Decimal128;
@@ -45,11 +48,12 @@ import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoClientSettings;
/**
* Wrapper class to contain useful converters for the usage with Mongo.
*
@@ -236,9 +240,13 @@ abstract class MongoConverters {
INSTANCE;
private final Codec<Document> codec = CodecRegistryHelper
.createRegistry(MongoClientSettings.getDefaultCodecRegistry(), UuidRepresentation.JAVA_LEGACY)
.get(Document.class);
@Override
public String convert(Document source) {
return source.toJson();
return source.toJson(codec);
}
}

View File

@@ -1086,8 +1086,8 @@ public class QueryMapper {
removePlaceholders(DOT_POSITIONAL_PATTERN, pathExpression));
if (sourceProperty != null && sourceProperty.getOwner().equals(entity)) {
return mappingContext
.getPersistentPropertyPath(PropertyPath.from(sourceProperty.getName(), entity.getTypeInformation()));
return mappingContext.getPersistentPropertyPath(
PropertyPath.from(Pattern.quote(sourceProperty.getName()), entity.getTypeInformation()));
}
PropertyPath path = forName(rawPath);
@@ -1146,13 +1146,21 @@ public class QueryMapper {
return forName(path.substring(0, path.length() - 3) + "id");
}
// Ok give it another try quoting
try {
return PropertyPath.from(Pattern.quote(path), entity.getTypeInformation());
} catch (PropertyReferenceException | InvalidPersistentPropertyPath ex) {
}
return null;
}
}
private boolean isPathToJavaLangClassProperty(PropertyPath path) {
if (path.getType().equals(Class.class) && path.getLeafProperty().getOwningType().getType().equals(Class.class)) {
if ((path.getType() == Class.class || path.getType().equals(Object.class))
&& path.getLeafProperty().getType() == Class.class) {
return true;
}
return false;
@@ -1261,9 +1269,9 @@ public class QueryMapper {
String partial = iterator.next();
boolean isPositional = (isPositionalParameter(partial) && (property.isMap() || property.isCollectionLike()));
boolean isPositional = isPositionalParameter(partial) && property.isCollectionLike();
if (isPositional) {
if (isPositional || property.isMap()) {
mappedName.append(".").append(partial);
}

View File

@@ -20,8 +20,10 @@ import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
@@ -58,6 +60,7 @@ import com.mongodb.BasicDBList;
* @author Christoph Strobl
* @author Mark Paluch
* @author Andreas Zink
* @author Clément Petit
*/
public class Criteria implements CriteriaDefinition {
@@ -124,7 +127,12 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Static factory method to create a {@link Criteria} matching an example object.
* Static factory method to create a {@link Criteria} matching an example object. <br />
* By default the {@link Example} uses typed matching restricting it to probe assignable types. For example, when
* sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125; </code>. <br />
* To avoid the above mentioned type restriction use an {@link UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
* @param example must not be {@literal null}.
* @return new instance of {@link Criteria}.
@@ -615,8 +623,15 @@ public class Criteria implements CriteriaDefinition {
*/
public Criteria alike(Example<?> sample) {
criteria.put("$example", sample);
return this;
if (StringUtils.hasText(this.getKey())) {
criteria.put("$example", sample);
return this;
}
Criteria exampleCriteria = new Criteria();
exampleCriteria.criteria.put("$example", sample);
return registerCriteriaChainElement(exampleCriteria);
}
/**
@@ -883,15 +898,15 @@ public class Criteria implements CriteriaDefinition {
* @param right
* @return
*/
private boolean isEqual(Object left, Object right) {
private boolean isEqual(@Nullable Object left, @Nullable Object right) {
if (left == null) {
return right == null;
}
if (Pattern.class.isInstance(left)) {
if (left instanceof Pattern) {
if (!Pattern.class.isInstance(right)) {
if (!(right instanceof Pattern)) {
return false;
}
@@ -902,6 +917,52 @@ public class Criteria implements CriteriaDefinition {
&& leftPattern.flags() == rightPattern.flags();
}
if (left instanceof Document) {
if (!(right instanceof Document)) {
return false;
}
Document leftDocument = (Document) left;
Document rightDocument = (Document) right;
Iterator<Entry<String, Object>> leftIterator = leftDocument.entrySet().iterator();
Iterator<Entry<String, Object>> rightIterator = rightDocument.entrySet().iterator();
while (leftIterator.hasNext() && rightIterator.hasNext()) {
Map.Entry<String, Object> leftEntry = leftIterator.next();
Map.Entry<String, Object> rightEntry = rightIterator.next();
if (!isEqual(leftEntry.getKey(), rightEntry.getKey())
|| !isEqual(leftEntry.getValue(), rightEntry.getValue())) {
return false;
}
}
return !leftIterator.hasNext() && !rightIterator.hasNext();
}
if (Collection.class.isAssignableFrom(left.getClass())) {
if (!Collection.class.isAssignableFrom(right.getClass())) {
return false;
}
Collection<?> leftCollection = (Collection<?>) left;
Collection<?> rightCollection = (Collection<?>) right;
Iterator<?> leftIterator = leftCollection.iterator();
Iterator<?> rightIterator = rightCollection.iterator();
while (leftIterator.hasNext() && rightIterator.hasNext()) {
if (!isEqual(leftIterator.next(), rightIterator.next())) {
return false;
}
}
return !leftIterator.hasNext() && !rightIterator.hasNext();
}
return ObjectUtils.nullSafeEquals(left, right);
}

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.query;
import java.util.regex.Pattern;
import org.bson.BsonRegularExpression;
import org.springframework.lang.Nullable;
/**
@@ -102,6 +103,15 @@ public enum MongoRegexCreator {
}
}
/**
* @param source
* @return
* @since 2.2.14
*/
public Object toCaseInsensitiveMatch(Object source) {
return source instanceof String ? new BsonRegularExpression(Pattern.quote((String) source), "i") : source;
}
private String prepareAndEscapeStringBeforeApplyingLikeRegex(String source, MatchMode matcherType) {
if (MatchMode.REGEX == matcherType) {

View File

@@ -78,16 +78,31 @@ public interface MongoRepository<T, ID> extends PagingAndSortingRepository<T, ID
*/
<S extends T> List<S> insert(Iterable<S> entities);
/*
* (non-Javadoc)
/**
* Returns all entities matching the given {@link Example}. In case no match could be found an empty {@link List} is
* returned. <br />
* By default the {@link Example} uses typed matching restricting it to probe assignable types. For example, when
* sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125;</code>. <br />
* To avoid the above mentioned type restriction use an {@link org.springframework.data.mongodb.core.query.UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findAll(org.springframework.data.domain.Example)
*/
@Override
<S extends T> List<S> findAll(Example<S> example);
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findAll(org.springframework.data.domain.Example, org.springframework.data.domain.Sort)
/**
* Returns all entities matching the given {@link Example} applying the given {@link Sort}. In case no match could be
* found an empty {@link List} is returned. <br />
* By default the {@link Example} uses typed matching restricting it to probe assignable types. For example, when
* sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125;</code>. <br />
* To avoid the above mentioned type restriction use an {@link org.springframework.data.mongodb.core.query.UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findAll(org.springframework.data.domain.Example,
* org.springframework.data.domain.Sort)
*/
@Override
<S extends T> List<S> findAll(Example<S> example, Sort sort);

View File

@@ -64,16 +64,33 @@ public interface ReactiveMongoRepository<T, ID> extends ReactiveSortingRepositor
*/
<S extends T> Flux<S> insert(Publisher<S> entities);
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findAll(org.springframework.data.domain.Example)
/**
* Returns all entities matching the given {@link Example}. In case no match could be found an empty {@link Flux} is
* returned. <br />
* By default the {@link Example} uses typed matching restricting it to probe assignable types. For example, when
* sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125;</code>. <br />
* To avoid the above mentioned type restriction use an {@link org.springframework.data.mongodb.core.query.UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#findAll(org.springframework.data.domain.Example)
*/
@Override
<S extends T> Flux<S> findAll(Example<S> example);
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryByExampleExecutor#findAll(org.springframework.data.domain.Example, org.springframework.data.domain.Sort)
/**
* Returns all entities matching the given {@link Example} applying the given {@link Sort}. In case no match could be
* found an empty {@link Flux} is returned. <br />
* By default the {@link Example} uses typed matching restricting it to probe assignable types. For example, when
* sticking with the default type key ({@code _class}), the query has restrictions such as
* <code>_class : &#123; $in : [com.acme.Person] &#125;</code>. <br />
* To avoid the above mentioned type restriction use an {@link org.springframework.data.mongodb.core.query.UntypedExampleMatcher} with
* {@link Example#of(Object, org.springframework.data.domain.ExampleMatcher)}.
*
* @see org.springframework.data.repository.query.ReactiveQueryByExampleExecutor#findAll(org.springframework.data.domain.Example,
* org.springframework.data.domain.Sort)
*/
@Override
<S extends T> Flux<S> findAll(Example<S> example, Sort sort);
}

View File

@@ -33,6 +33,7 @@ import org.springframework.data.repository.query.QueryMethodEvaluationContextPro
import org.springframework.expression.ExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
@@ -163,9 +164,9 @@ abstract class AggregationUtils {
* @throws IllegalArgumentException when none of the above rules is met.
*/
@Nullable
static <T> T extractSimpleTypeResult(Document source, Class<T> targetType, MongoConverter converter) {
static <T> T extractSimpleTypeResult(@Nullable Document source, Class<T> targetType, MongoConverter converter) {
if (source.isEmpty()) {
if (ObjectUtils.isEmpty(source)) {
return null;
}

View File

@@ -25,7 +25,6 @@ import java.util.regex.Pattern;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Range;
import org.springframework.data.domain.Range.Bound;
import org.springframework.data.domain.Sort;
@@ -51,8 +50,10 @@ import org.springframework.data.repository.query.parser.Part;
import org.springframework.data.repository.query.parser.Part.IgnoreCaseType;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.data.util.Streamable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
/**
* Custom query creator to create Mongo criteria.
@@ -196,9 +197,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case IS_NULL:
return criteria.is(null);
case NOT_IN:
return criteria.nin(nextAsArray(parameters));
return criteria.nin(nextAsList(parameters, part));
case IN:
return criteria.in(nextAsArray(parameters));
return criteria.in(nextAsList(parameters, part));
case LIKE:
case STARTING_WITH:
case ENDING_WITH:
@@ -337,7 +338,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
Iterator<Object> parameters) {
if (property.isCollectionLike()) {
return criteria.in(nextAsArray(parameters));
return criteria.in(nextAsList(parameters, part));
}
return addAppropriateLikeRegexTo(criteria, part, parameters.next());
@@ -400,17 +401,24 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
String.format("Expected parameter type of %s but got %s!", type, parameter.getClass()));
}
private Object[] nextAsArray(Iterator<Object> iterator) {
private java.util.List<?> nextAsList(Iterator<Object> iterator, Part part) {
Object next = iterator.next();
if (next instanceof Collection) {
return ((Collection<?>) next).toArray();
} else if (next != null && next.getClass().isArray()) {
return (Object[]) next;
Streamable<?> streamable = asStreamable(iterator.next());
if (!isSimpleComparisionPossible(part)) {
streamable = streamable.map(MongoRegexCreator.INSTANCE::toCaseInsensitiveMatch);
}
return new Object[] { next };
return streamable.toList();
}
private Streamable<?> asStreamable(Object value) {
if (value instanceof Collection) {
return Streamable.of((Collection<?>) value);
} else if (ObjectUtils.isArray(value)) {
return Streamable.of((Object[]) value);
}
return Streamable.of(value);
}
private String toLikeRegex(String source, Part part) {

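A hedged sketch of derived query methods that exercise the new list handling (mirroring the repository shape used in the integration tests further down): with `IgnoreCase`, each `in`/`nin` value is turned into a case-insensitive match via `MongoRegexCreator`, while the case-sensitive variant keeps plain equality.

[source,java]
----
public interface PersonRepository extends MongoRepository<Person, String> {

	// each value becomes a case-insensitive match; regex metacharacters are quoted rather than evaluated
	List<Person> findByLastnameIgnoreCaseIn(String... lastnames);

	// plain equality; values are not treated as regular expressions
	List<Person> findByFirstnameIn(String... firstnames);
}
----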
View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.repository.support;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.Optional;
@@ -215,7 +216,7 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
Assert.notNull(ids, "The given Ids of entities not be null!");
return findAll(new Query(new Criteria(entityInformation.getIdAttribute())
.in(Streamable.of(ids).stream().collect(StreamUtils.toUnmodifiableList()))));
.in(toCollection(ids))));
}
/*
@@ -266,10 +267,10 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
Assert.notNull(entities, "The given Iterable of entities not be null!");
List<S> list = Streamable.of(entities).stream().collect(StreamUtils.toUnmodifiableList());
Collection<S> list = toCollection(entities);
if (list.isEmpty()) {
return list;
return Collections.emptyList();
}
return new ArrayList<>(mongoOperations.insertAll(list));
@@ -374,6 +375,11 @@ public class SimpleMongoRepository<T, ID> implements MongoRepository<T, ID> {
return where(entityInformation.getIdAttribute()).is(id);
}
private static <E> Collection<E> toCollection(Iterable<E> ids) {
return ids instanceof Collection ? (Collection<E>) ids
: StreamUtils.createStreamFromIterator(ids.iterator()).collect(Collectors.toList());
}
private List<T> findAll(@Nullable Query query) {
if (query == null) {

View File

@@ -21,10 +21,11 @@ import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.io.Serializable;
import java.util.List;
import java.util.Collection;
import java.util.stream.Collectors;
import org.reactivestreams.Publisher;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.dao.OptimisticLockingFailureException;
import org.springframework.data.domain.Example;
@@ -47,6 +48,7 @@ import com.mongodb.client.result.DeleteResult;
* @author Oliver Gierke
* @author Christoph Strobl
* @author Ruben J Garcia
* @author Clément Petit
* @since 2.0
*/
public class SimpleReactiveMongoRepository<T, ID extends Serializable> implements ReactiveMongoRepository<T, ID> {
@@ -173,7 +175,7 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
Assert.notNull(ids, "The given Iterable of Id's must not be null!");
return findAll(new Query(new Criteria(entityInformation.getIdAttribute())
.in(Streamable.of(ids).stream().collect(StreamUtils.toUnmodifiableList()))));
.in(toCollection(ids))));
}
/*
@@ -274,9 +276,9 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
Assert.notNull(entities, "The given Iterable of entities must not be null!");
List<S> source = Streamable.of(entities).stream().collect(StreamUtils.toUnmodifiableList());
Collection<S> source = toCollection(entities);
return source.isEmpty() ? Flux.empty() : Flux.from(mongoOperations.insertAll(source));
return source.isEmpty() ? Flux.empty() : mongoOperations.insertAll(source);
}
/*
@@ -333,8 +335,8 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
Assert.notNull(entityStream, "The given Publisher of entities must not be null!");
return Flux.from(entityStream).flatMap(entity -> entityInformation.isNew(entity) ? //
mongoOperations.insert(entity, entityInformation.getCollectionName()).then(Mono.just(entity)) : //
mongoOperations.save(entity, entityInformation.getCollectionName()).then(Mono.just(entity)));
mongoOperations.insert(entity, entityInformation.getCollectionName()) : //
mongoOperations.save(entity, entityInformation.getCollectionName()));
}
/*
@@ -436,8 +438,12 @@ public class SimpleReactiveMongoRepository<T, ID extends Serializable> implement
return where(entityInformation.getIdAttribute()).is(id);
}
private Flux<T> findAll(Query query) {
private static <E> Collection<E> toCollection(Iterable<E> ids) {
return ids instanceof Collection ? (Collection<E>) ids
: StreamUtils.createStreamFromIterator(ids.iterator()).collect(Collectors.toList());
}
private Flux<T> findAll(Query query) {
return mongoOperations.find(query, entityInformation.getJavaType(), entityInformation.getCollectionName());
}
}

View File

@@ -282,6 +282,109 @@ public class BsonUtils {
.orElseGet(() -> new DocumentCodec(codecRegistryProvider.getCodecRegistry())));
}
/**
* Resolve the value for a given key. If the given {@link Bson} value contains the key, the value is returned
* immediately. If not, and the key contains a path using dot ({@code .}) notation, the path is resolved by
* inspecting the individual parts. If one of the intermediate values is {@literal null} or cannot be inspected
* further (wrong type), {@literal null} is returned.
*
* @param bson the source to inspect. Must not be {@literal null}.
* @param key the key to lookup. Must not be {@literal null}.
* @return can be {@literal null}.
* @since 3.0.8
*/
@Nullable
public static Object resolveValue(Bson bson, String key) {
Map<String, Object> source = asMap(bson);
if (source.containsKey(key) || !key.contains(".")) {
return source.get(key);
}
String[] parts = key.split("\\.");
for (int i = 1; i < parts.length; i++) {
Object result = source.get(parts[i - 1]);
if (!(result instanceof Bson)) {
return null;
}
source = asMap((Bson) result);
}
return source.get(parts[parts.length - 1]);
}
/**
* Returns whether the underlying {@link Bson bson} has a value ({@literal null} or non-{@literal null}) for the given
* {@code key}.
*
* @param bson the source to inspect. Must not be {@literal null}.
* @param key the key to lookup. Must not be {@literal null}.
* @return {@literal true} if a value for the given {@code key} is present.
* @since 3.0.8
*/
public static boolean hasValue(Bson bson, String key) {
Map<String, Object> source = asMap(bson);
if (source.get(key) != null) {
return true;
}
if (!key.contains(".")) {
return false;
}
String[] parts = key.split("\\.");
Object result;
for (int i = 1; i < parts.length; i++) {
result = source.get(parts[i - 1]);
source = getAsMap(result);
if (source == null) {
return false;
}
}
return source.containsKey(parts[parts.length - 1]);
}
/**
* Returns the given source object as map, i.e. {@link Document}s and maps as is or {@literal null} otherwise.
*
* @param source can be {@literal null}.
* @return can be {@literal null}.
*/
@Nullable
@SuppressWarnings("unchecked")
private static Map<String, Object> getAsMap(Object source) {
if (source instanceof Document) {
return (Document) source;
}
if (source instanceof BasicDBObject) {
return (BasicDBObject) source;
}
if (source instanceof DBObject) {
return ((DBObject) source).toMap();
}
if (source instanceof Map) {
return (Map<String, Object>) source;
}
return null;
}
@Nullable
private static String toJson(@Nullable Object value) {

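A minimal usage sketch of the two helpers added above, assuming a plain `org.bson.Document` as the source:

[source,java]
----
Document source = new Document("_id", "id-1")
		.append("nested", new Document("customName", "cname"));

Object value = BsonUtils.resolveValue(source, "nested.customName"); // "cname"
boolean present = BsonUtils.hasValue(source, "nested.customName");  // true
Object missing = BsonUtils.resolveValue(source, "nested.other");    // null
----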
View File

@@ -20,6 +20,8 @@ import com.mongodb.client.result.UpdateResult
import com.mongodb.reactivestreams.client.MongoCollection
import org.bson.Document
import org.springframework.data.geo.GeoResult
import org.springframework.data.mongodb.core.aggregation.Aggregation
import org.springframework.data.mongodb.core.aggregation.TypedAggregation
import org.springframework.data.mongodb.core.index.ReactiveIndexOperations
import org.springframework.data.mongodb.core.query.NearQuery
import org.springframework.data.mongodb.core.query.Query
@@ -210,6 +212,52 @@ inline fun <reified T : Any, reified E : Any> ReactiveMongoOperations.findDistin
if (collectionName != null) findDistinct(query, field, collectionName, E::class.java, T::class.java)
else findDistinct(query, field, E::class.java, T::class.java)
/**
* Extension for [ReactiveMongoOperations.aggregate] leveraging reified type parameters.
*
* @author Wonwoo Lee
* @since 3.1.4
*/
inline fun <reified O : Any> ReactiveMongoOperations.aggregate(
aggregation: TypedAggregation<*>,
collectionName: String
): Flux<O> =
this.aggregate(aggregation, collectionName, O::class.java)
/**
* Extension for [ReactiveMongoOperations.aggregate] leveraging reified type parameters.
*
* @author Wonwoo Lee
* @since 3.1.4
*/
inline fun <reified O : Any> ReactiveMongoOperations.aggregate(aggregation: TypedAggregation<*>): Flux<O> =
this.aggregate(aggregation, O::class.java)
/**
* Extension for [ReactiveMongoOperations.aggregate] leveraging reified type parameters.
*
* @author Wonwoo Lee
* @author Mark Paluch
* @since 3.1.4
*/
inline fun <reified I : Any, reified O : Any> ReactiveMongoOperations.aggregate(
aggregation: Aggregation
): Flux<O> =
this.aggregate(aggregation, I::class.java, O::class.java)
/**
* Extension for [ReactiveMongoOperations.aggregate] leveraging reified type parameters.
*
* @author Wonwoo Lee
* @since 3.1.4
*/
inline fun <reified O : Any> ReactiveMongoOperations.aggregate(
aggregation: Aggregation,
collectionName: String
): Flux<O> =
this.aggregate(aggregation, collectionName, O::class.java)
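The extensions above delegate to the existing Java API. A rough Java equivalent of the collection-bound variant, assuming hypothetical `Order`/`OrderSummary` types and a `reactiveMongoOperations` instance:

[source,java]
----
TypedAggregation<Order> aggregation = Aggregation.newAggregation(Order.class,
		Aggregation.group("customerId").count().as("orders"));

Flux<OrderSummary> summaries = reactiveMongoOperations.aggregate(aggregation, "orders", OrderSummary.class);
----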
/**
* Extension for [ReactiveMongoOperations.geoNear] leveraging reified type parameters.
*

View File

@@ -17,8 +17,6 @@ package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
import java.net.UnknownHostException;
import org.bson.BsonDocument;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
@@ -32,11 +30,14 @@ import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.data.mongodb.ClientSessionException;
import org.springframework.data.mongodb.MongoTransactionException;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.lang.Nullable;
import com.mongodb.MongoCursorNotFoundException;
import com.mongodb.MongoException;
import com.mongodb.MongoInternalException;
import com.mongodb.MongoSocketException;
import com.mongodb.MongoSocketReadTimeoutException;
import com.mongodb.MongoSocketWriteException;
import com.mongodb.ServerAddress;
/**
@@ -45,18 +46,20 @@ import com.mongodb.ServerAddress;
* @author Michal Vich
* @author Oliver Gierke
* @author Christoph Strobl
* @author Brice Vandeputte
*/
public class MongoExceptionTranslatorUnitTests {
class MongoExceptionTranslatorUnitTests {
MongoExceptionTranslator translator;
private static final String EXCEPTION_MESSAGE = "IOException";
private MongoExceptionTranslator translator;
@BeforeEach
public void setUp() {
void setUp() {
translator = new MongoExceptionTranslator();
}
@Test
public void translateDuplicateKey() {
void translateDuplicateKey() {
expectExceptionWithCauseMessage(
translator.translateExceptionIfPossible(
@@ -64,17 +67,33 @@ public class MongoExceptionTranslatorUnitTests {
DuplicateKeyException.class, null);
}
@Test
public void translateSocketException() {
@Test // GH-3568
void translateSocketException() {
expectExceptionWithCauseMessage(
translator.translateExceptionIfPossible(new MongoSocketException("IOException", new ServerAddress())),
DataAccessResourceFailureException.class, "IOException");
translator.translateExceptionIfPossible(new MongoSocketException(EXCEPTION_MESSAGE, new ServerAddress())),
DataAccessResourceFailureException.class, EXCEPTION_MESSAGE);
}
@Test // GH-3568
void translateSocketExceptionSubclasses() {
expectExceptionWithCauseMessage(
translator.translateExceptionIfPossible(
new MongoSocketWriteException("intermediate message", new ServerAddress(), new Exception(EXCEPTION_MESSAGE))
),
DataAccessResourceFailureException.class, EXCEPTION_MESSAGE);
expectExceptionWithCauseMessage(
translator.translateExceptionIfPossible(
new MongoSocketReadTimeoutException("intermediate message", new ServerAddress(), new Exception(EXCEPTION_MESSAGE))
),
DataAccessResourceFailureException.class, EXCEPTION_MESSAGE);
}
@Test
public void translateCursorNotFound() throws UnknownHostException {
void translateCursorNotFound() {
expectExceptionWithCauseMessage(
translator.translateExceptionIfPossible(new MongoCursorNotFoundException(1L, new ServerAddress())),
@@ -82,21 +101,21 @@ public class MongoExceptionTranslatorUnitTests {
}
@Test
public void translateToDuplicateKeyException() {
void translateToDuplicateKeyException() {
checkTranslatedMongoException(DuplicateKeyException.class, 11000);
checkTranslatedMongoException(DuplicateKeyException.class, 11001);
}
@Test
public void translateToDataAccessResourceFailureException() {
void translateToDataAccessResourceFailureException() {
checkTranslatedMongoException(DataAccessResourceFailureException.class, 12000);
checkTranslatedMongoException(DataAccessResourceFailureException.class, 13440);
}
@Test
public void translateToInvalidDataAccessApiUsageException() {
void translateToInvalidDataAccessApiUsageException() {
checkTranslatedMongoException(InvalidDataAccessApiUsageException.class, 10003);
checkTranslatedMongoException(InvalidDataAccessApiUsageException.class, 12001);
@@ -106,7 +125,7 @@ public class MongoExceptionTranslatorUnitTests {
}
@Test
public void translateToUncategorizedMongoDbException() {
void translateToUncategorizedMongoDbException() {
MongoException exception = new MongoException(0, "");
DataAccessException translatedException = translator.translateExceptionIfPossible(exception);
@@ -115,7 +134,7 @@ public class MongoExceptionTranslatorUnitTests {
}
@Test
public void translateMongoInternalException() {
void translateMongoInternalException() {
MongoInternalException exception = new MongoInternalException("Internal exception");
DataAccessException translatedException = translator.translateExceptionIfPossible(exception);
@@ -124,14 +143,14 @@ public class MongoExceptionTranslatorUnitTests {
}
@Test
public void translateUnsupportedException() {
void translateUnsupportedException() {
RuntimeException exception = new RuntimeException();
assertThat(translator.translateExceptionIfPossible(exception)).isNull();
}
@Test // DATAMONGO-2045
public void translateSessionExceptions() {
void translateSessionExceptions() {
checkTranslatedMongoException(ClientSessionException.class, 206);
checkTranslatedMongoException(ClientSessionException.class, 213);
@@ -140,7 +159,7 @@ public class MongoExceptionTranslatorUnitTests {
}
@Test // DATAMONGO-2045
public void translateTransactionExceptions() {
void translateTransactionExceptions() {
checkTranslatedMongoException(MongoTransactionException.class, 217);
checkTranslatedMongoException(MongoTransactionException.class, 225);
@@ -163,13 +182,13 @@ public class MongoExceptionTranslatorUnitTests {
assertThat(((MongoException) cause).getCode()).isEqualTo(code);
}
private static void expectExceptionWithCauseMessage(NestedRuntimeException e,
private static void expectExceptionWithCauseMessage(@Nullable NestedRuntimeException e,
Class<? extends NestedRuntimeException> type) {
expectExceptionWithCauseMessage(e, type, null);
}
private static void expectExceptionWithCauseMessage(NestedRuntimeException e,
Class<? extends NestedRuntimeException> type, String message) {
private static void expectExceptionWithCauseMessage(@Nullable NestedRuntimeException e,
Class<? extends NestedRuntimeException> type, @Nullable String message) {
assertThat(e).isInstanceOf(type);

View File

@@ -84,6 +84,7 @@ import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.Sharded;
import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertCallback;
import org.springframework.data.mongodb.core.mapping.event.AfterSaveCallback;
@@ -1922,6 +1923,24 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
verify(findIterable, never()).first();
}
@Test // GH-3590
void shouldIncludeValueFromNestedShardKeyPath() {
WithShardKeyPointingToNested source = new WithShardKeyPointingToNested();
source.id = "id-1";
source.value = "v1";
source.nested = new WithNamedFields();
source.nested.customName = "cname";
source.nested.name = "name";
template.save(source);
ArgumentCaptor<Bson> filter = ArgumentCaptor.forClass(Bson.class);
verify(collection).replaceOne(filter.capture(), any(), any());
assertThat(filter.getValue()).isEqualTo(new Document("_id", "id-1").append("value", "v1").append("nested.custom-named-field", "cname"));
}
@Test // DATAMONGO-2341
void saveShouldProjectOnShardKeyWhenLoadingExistingDocument() {
@@ -2267,6 +2286,13 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Field("firstname") String name;
}
@Sharded(shardKey = {"value", "nested.customName"})
static class WithShardKeyPointingToNested {
String id;
String value;
WithNamedFields nested;
}
/**
* Mocks out the {@link MongoTemplate#getDb()} method to return the {@link DB} mock instead of executing the actual
* behaviour.

View File

@@ -139,16 +139,14 @@ public class QueryByExampleTests {
assertThat(result).containsExactlyInAnyOrder(p1, p2, p3);
}
@Test // DATAMONGO-1245
@Test // DATAMONGO-1245, GH-3544
public void findByExampleWithCriteria() {
Person sample = new Person();
sample.lastname = "stark";
Query query = new Query(new Criteria().alike(Example.of(sample)).and("firstname").regex("^ary*"));
List<Person> result = operations.find(query, Person.class);
assertThat(result).hasSize(1);
Query query = new Query(new Criteria().alike(Example.of(sample)).and("firstname").regex(".*n.*"));
assertThat(operations.find(query, Person.class)).containsExactly(p1);
}
@Test // DATAMONGO-1459

View File

@@ -30,6 +30,8 @@ import java.time.LocalDateTime;
import java.time.temporal.ChronoUnit;
import java.util.*;
import javax.persistence.metamodel.EmbeddableType;
import org.assertj.core.api.Assertions;
import org.bson.types.Code;
import org.bson.types.Decimal128;
@@ -2179,6 +2181,15 @@ public class MappingMongoConverterUnitTests {
assertThat(((LinkedHashMap) result.get("cluster")).get("_id")).isEqualTo(100L);
}
@Test // GH-3546
void readFlattensNestedDocumentToStringIfNecessary() {
org.bson.Document source = new org.bson.Document("street", new org.bson.Document("json", "string").append("_id", UUID.randomUUID()));
Address target = converter.read(Address.class, source);
assertThat(target.street).isNotNull();
}
static class GenericType<T> {
T content;
}

View File

@@ -771,6 +771,20 @@ public class QueryMapperUnitTests {
assertThat(document).containsEntry("legacyPoint.y", 20D);
}
@Test // GH-3544
void exampleWithCombinedCriteriaShouldBeMappedCorrectly() {
Foo probe = new Foo();
probe.embedded = new EmbeddedClass();
probe.embedded.id = "conflux";
Query query = query(byExample(probe).and("listOfItems").exists(true));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(Foo.class));
assertThat(document).containsEntry("embedded\\._id", "conflux").containsEntry("my_items",
new org.bson.Document("$exists", true));
}
@Test // DATAMONGO-1988
void mapsStringObjectIdRepresentationToObjectIdWhenReferencingIdProperty() {
@@ -998,6 +1012,76 @@ public class QueryMapperUnitTests {
assertThat(target).isEqualTo(org.bson.Document.parse("{\"$text\" : { \"$search\" : \"test\" }}"));
}
@Test // GH-3601
void resolvesFieldnameWithUnderscoresCorrectly() {
Query query = query(where("fieldname_with_underscores").exists(true));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(WithPropertyUsingUnderscoreInName.class));
assertThat(document)
.isEqualTo(new org.bson.Document("fieldname_with_underscores", new org.bson.Document("$exists", true)));
}
@Test // GH-3601
void resolvesMappedFieldnameWithUnderscoresCorrectly() {
Query query = query(where("renamed_fieldname_with_underscores").exists(true));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(WithPropertyUsingUnderscoreInName.class));
assertThat(document).isEqualTo(new org.bson.Document("renamed", new org.bson.Document("$exists", true)));
}
@Test // GH-3601
void resolvesSimpleNestedFieldnameWithUnderscoresCorrectly() {
Query query = query(where("simple.fieldname_with_underscores").exists(true));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(WrapperAroundWithPropertyUsingUnderscoreInName.class));
assertThat(document)
.isEqualTo(new org.bson.Document("simple.fieldname_with_underscores", new org.bson.Document("$exists", true)));
}
@Test // GH-3601
void resolvesSimpleNestedMappedFieldnameWithUnderscoresCorrectly() {
Query query = query(where("simple.renamed_fieldname_with_underscores").exists(true));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(WrapperAroundWithPropertyUsingUnderscoreInName.class));
assertThat(document).isEqualTo(new org.bson.Document("simple.renamed", new org.bson.Document("$exists", true)));
}
@Test // GH-3601
void resolvesFieldNameWithUnderscoreOnNestedFieldnameWithUnderscoresCorrectly() {
Query query = query(where("double_underscore.fieldname_with_underscores").exists(true));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(WrapperAroundWithPropertyUsingUnderscoreInName.class));
assertThat(document).isEqualTo(
new org.bson.Document("double_underscore.fieldname_with_underscores", new org.bson.Document("$exists", true)));
}
@Test // GH-3601
void resolvesFieldNameWithUnderscoreOnNestedMappedFieldnameWithUnderscoresCorrectly() {
Query query = query(where("double_underscore.renamed_fieldname_with_underscores").exists(true));
org.bson.Document document = mapper.getMappedObject(query.getQueryObject(),
context.getPersistentEntity(WrapperAroundWithPropertyUsingUnderscoreInName.class));
assertThat(document)
.isEqualTo(new org.bson.Document("double_underscore.renamed", new org.bson.Document("$exists", true)));
}
class WithDeepArrayNesting {
List<WithNestedArray> level0;
@@ -1181,4 +1265,17 @@ public class QueryMapperUnitTests {
this.value = value;
}
}
static class WrapperAroundWithPropertyUsingUnderscoreInName {
WithPropertyUsingUnderscoreInName simple;
WithPropertyUsingUnderscoreInName double_underscore;
}
static class WithPropertyUsingUnderscoreInName {
String fieldname_with_underscores;
@Field("renamed") String renamed_fieldname_with_underscores;
}
}

View File

@@ -1089,6 +1089,38 @@ class UpdateMapperUnitTests {
assertThat(mappedUpdate).isEqualTo(new Document("$set", new Document("aliased.$[element].value", 10)));
}
@Test // GH-3552
void numericKeyForMap() {
Update update = new Update().set("map.601218778970110001827396", "testing");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObjectMap.class));
assertThat(mappedUpdate).isEqualTo("{\"$set\": {\"map.601218778970110001827396\": \"testing\"}}");
}
@Test // GH-3552
void numericKeyInMapOfNestedPath() {
Update update = new Update().set("map.601218778970110001827396.value", "testing");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObjectMap.class));
assertThat(mappedUpdate)
.isEqualTo("{\"$set\": {\"map.601218778970110001827396.value\": \"testing\"}}");
}
@Test // GH-3566
void mapsObjectClassPropertyFieldInMapValueTypeAsKey() {
Update update = new Update().set("map.class", "value");
Document mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(EntityWithObjectMap.class));
assertThat(mappedUpdate)
.isEqualTo("{\"$set\": {\"map.class\": \"value\"}}");
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
ListModelWrapper concreteTypeWithListAttributeOfInterfaceType;
}

View File

@@ -34,6 +34,8 @@ import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Andreas Zink
* @author Clément Petit
* @author Mark Paluch
*/
public class CriteriaUnitTests {
@@ -310,9 +312,72 @@ public class CriteriaUnitTests {
@Test // DATAMONGO-2002
public void shouldEqualForSamePattern() {
Criteria left = new Criteria("field").regex("foo");
Criteria right = new Criteria("field").regex("foo");
assertThat(left).isEqualTo(right);
}
@Test // DATAMONGO-2002
public void shouldEqualForDocument() {
assertThat(new Criteria("field").is(new Document("one", 1).append("two", "two").append("null", null)))
.isEqualTo(new Criteria("field").is(new Document("one", 1).append("two", "two").append("null", null)));
assertThat(new Criteria("field").is(new Document("one", 1).append("two", "two").append("null", null)))
.isNotEqualTo(new Criteria("field").is(new Document("one", 1).append("two", "two")));
assertThat(new Criteria("field").is(new Document("one", 1).append("two", "two")))
.isNotEqualTo(new Criteria("field").is(new Document("one", 1).append("two", "two").append("null", null)));
assertThat(new Criteria("field").is(new Document("one", 1).append("null", null).append("two", "two")))
.isNotEqualTo(new Criteria("field").is(new Document("one", 1).append("two", "two").append("null", null)));
assertThat(new Criteria("field").is(new Document())).isNotEqualTo(new Criteria("field").is("foo"));
assertThat(new Criteria("field").is("foo")).isNotEqualTo(new Criteria("field").is(new Document()));
}
@Test // DATAMONGO-2002
public void shouldEqualForCollection() {
assertThat(new Criteria("field").is(Arrays.asList("foo", "bar")))
.isEqualTo(new Criteria("field").is(Arrays.asList("foo", "bar")));
assertThat(new Criteria("field").is(Arrays.asList("foo", 1)))
.isNotEqualTo(new Criteria("field").is(Arrays.asList("foo", "bar")));
assertThat(new Criteria("field").is(Collections.singletonList("foo")))
.isNotEqualTo(new Criteria("field").is(Arrays.asList("foo", "bar")));
assertThat(new Criteria("field").is(Arrays.asList("foo", "bar")))
.isNotEqualTo(new Criteria("field").is(Collections.singletonList("foo")));
assertThat(new Criteria("field").is(Arrays.asList("foo", "bar"))).isNotEqualTo(new Criteria("field").is("foo"));
assertThat(new Criteria("field").is("foo")).isNotEqualTo(new Criteria("field").is(Arrays.asList("foo", "bar")));
}
@Test // GH-3414
public void shouldEqualForSamePatternAndFlags() {
Criteria left = new Criteria("field").regex("foo", "iu");
Criteria right = new Criteria("field").regex("foo");
assertThat(left).isNotEqualTo(right);
}
@Test // GH-3414
public void shouldEqualForNestedPattern() {
Criteria left = new Criteria("a").orOperator(
new Criteria("foo").regex("value", "i"),
new Criteria("bar").regex("value")
);
Criteria right = new Criteria("a").orOperator(
new Criteria("foo").regex("value", "i"),
new Criteria("bar").regex("value")
);
assertThat(left).isEqualTo(right);
}
}

View File

@@ -1363,4 +1363,19 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
assertThat(repository.findWithSpelByFirstnameForSpELExpressionWithParameterIndexOnly("Dave")).containsExactly(dave);
assertThat(repository.findWithSpelByFirstnameForSpELExpressionWithParameterIndexOnly("Carter")).containsExactly(carter);
}
@Test // GH-3395
void caseInSensitiveInClause() {
assertThat(repository.findByLastnameIgnoreCaseIn("bEAuFoRd", "maTTheWs")).hasSize(3);
}
@Test // GH-3395
void caseInSensitiveInClauseQuotesExpressions() {
assertThat(repository.findByLastnameIgnoreCaseIn(".*")).isEmpty();
}
@Test // GH-3395
void caseSensitiveInClauseIgnoresExpressions() {
assertThat(repository.findByFirstnameIn(".*")).isEmpty();
}
}

View File

@@ -125,6 +125,8 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
@Query("{ 'lastname' : { '$regex' : '?0', '$options' : 'i'}}")
Page<Person> findByLastnameLikeWithPageable(String lastname, Pageable pageable);
List<Person> findByLastnameIgnoreCaseIn(String... lastname);
/**
* Returns all {@link Person}s with a firstname contained in the given varargs.
*

View File

@@ -20,15 +20,19 @@ import static org.springframework.data.domain.ExampleMatcher.*;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.Value;
import lombok.With;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.test.StepVerifier;
import java.util.Arrays;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import javax.annotation.Nullable;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.BeanClassLoaderAware;
@@ -44,10 +48,9 @@ import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.repository.support.ReactiveMongoRepositoryFactory;
import org.springframework.data.mongodb.repository.support.SimpleReactiveMongoRepository;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.context.junit.jupiter.SpringExtension;
import org.springframework.util.ClassUtils;
/**
@@ -56,19 +59,22 @@ import org.springframework.util.ClassUtils;
* @author Mark Paluch
* @author Christoph Strobl
* @author Ruben J Garcia
* @author Clément Petit
*/
@RunWith(SpringRunner.class)
@ExtendWith(SpringExtension.class)
@ContextConfiguration("classpath:reactive-infrastructure.xml")
public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanFactoryAware {
@Autowired private ReactiveMongoTemplate template;
ReactiveMongoRepositoryFactory factory;
ClassLoader classLoader;
BeanFactory beanFactory;
ReactivePersonRepostitory repository;
private ReactiveMongoRepositoryFactory factory;
private ClassLoader classLoader;
private BeanFactory beanFactory;
private ReactivePersonRepository repository;
private ReactiveImmutablePersonRepository immutableRepository;
private ReactivePerson dave, oliver, carter, boyd, stefan, leroi, alicia;
private ImmutableReactivePerson keith, james, mariah;
@Override
public void setBeanClassLoader(ClassLoader classLoader) {
@@ -80,8 +86,8 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
this.beanFactory = beanFactory;
}
@Before
public void setUp() {
@BeforeEach
void setUp() {
factory = new ReactiveMongoRepositoryFactory(template);
factory.setRepositoryBaseClass(SimpleReactiveMongoRepository.class);
@@ -89,9 +95,11 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
factory.setBeanFactory(beanFactory);
factory.setEvaluationContextProvider(ReactiveQueryMethodEvaluationContextProvider.DEFAULT);
repository = factory.getRepository(ReactivePersonRepostitory.class);
repository = factory.getRepository(ReactivePersonRepository.class);
immutableRepository = factory.getRepository(ReactiveImmutablePersonRepository.class);
repository.deleteAll().as(StepVerifier::create).verifyComplete();
immutableRepository.deleteAll().as(StepVerifier::create).verifyComplete();
dave = new ReactivePerson("Dave", "Matthews", 42);
oliver = new ReactivePerson("Oliver August", "Matthews", 4);
@@ -100,6 +108,9 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
stefan = new ReactivePerson("Stefan", "Lessard", 34);
leroi = new ReactivePerson("Leroi", "Moore", 41);
alicia = new ReactivePerson("Alicia", "Keys", 30);
keith = new ImmutableReactivePerson(null, "Keith", "Urban", 53);
james = new ImmutableReactivePerson(null, "James", "Arthur", 33);
mariah = new ImmutableReactivePerson(null, "Mariah", "Carey", 51);
repository.saveAll(Arrays.asList(oliver, dave, carter, boyd, stefan, leroi, alicia)).as(StepVerifier::create) //
.expectNextCount(7) //
@@ -107,78 +118,78 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void existsByIdShouldReturnTrueForExistingObject() {
void existsByIdShouldReturnTrueForExistingObject() {
repository.existsById(dave.id).as(StepVerifier::create).expectNext(true).verifyComplete();
}
@Test // DATAMONGO-1444
public void existsByIdShouldReturnFalseForAbsentObject() {
void existsByIdShouldReturnFalseForAbsentObject() {
repository.existsById("unknown").as(StepVerifier::create).expectNext(false).verifyComplete();
}
@Test // DATAMONGO-1444
public void existsByMonoOfIdShouldReturnTrueForExistingObject() {
void existsByMonoOfIdShouldReturnTrueForExistingObject() {
repository.existsById(Mono.just(dave.id)).as(StepVerifier::create).expectNext(true).verifyComplete();
}
@Test // DATAMONGO-1712
public void existsByFluxOfIdShouldReturnTrueForExistingObject() {
void existsByFluxOfIdShouldReturnTrueForExistingObject() {
repository.existsById(Flux.just(dave.id, oliver.id)).as(StepVerifier::create).expectNext(true).verifyComplete();
}
@Test // DATAMONGO-1444
public void existsByEmptyMonoOfIdShouldReturnEmptyMono() {
void existsByEmptyMonoOfIdShouldReturnEmptyMono() {
repository.existsById(Mono.empty()).as(StepVerifier::create).verifyComplete();
}
@Test // DATAMONGO-1444
public void findByIdShouldReturnObject() {
void findByIdShouldReturnObject() {
repository.findById(dave.id).as(StepVerifier::create).expectNext(dave).verifyComplete();
}
@Test // DATAMONGO-1444
public void findByIdShouldCompleteWithoutValueForAbsentObject() {
void findByIdShouldCompleteWithoutValueForAbsentObject() {
repository.findById("unknown").as(StepVerifier::create).verifyComplete();
}
@Test // DATAMONGO-1444
public void findByIdByMonoOfIdShouldReturnTrueForExistingObject() {
void findByIdByMonoOfIdShouldReturnTrueForExistingObject() {
repository.findById(Mono.just(dave.id)).as(StepVerifier::create).expectNext(dave).verifyComplete();
}
@Test // DATAMONGO-1712
public void findByIdByFluxOfIdShouldReturnTrueForExistingObject() {
void findByIdByFluxOfIdShouldReturnTrueForExistingObject() {
repository.findById(Flux.just(dave.id, oliver.id)).as(StepVerifier::create).expectNext(dave).verifyComplete();
}
@Test // DATAMONGO-1444
public void findByIdByEmptyMonoOfIdShouldReturnEmptyMono() {
void findByIdByEmptyMonoOfIdShouldReturnEmptyMono() {
repository.findById(Mono.empty()).as(StepVerifier::create).verifyComplete();
}
@Test // DATAMONGO-1444
public void findAllShouldReturnAllResults() {
void findAllShouldReturnAllResults() {
repository.findAll().as(StepVerifier::create).expectNextCount(7).verifyComplete();
}
@Test // DATAMONGO-1444
public void findAllByIterableOfIdShouldReturnResults() {
void findAllByIterableOfIdShouldReturnResults() {
repository.findAllById(Arrays.asList(dave.id, boyd.id)).as(StepVerifier::create).expectNextCount(2)
.verifyComplete();
}
@Test // DATAMONGO-1444
public void findAllByPublisherOfIdShouldReturnResults() {
void findAllByPublisherOfIdShouldReturnResults() {
repository.findAllById(Flux.just(dave.id, boyd.id)).as(StepVerifier::create).expectNextCount(2).verifyComplete();
}
@Test // DATAMONGO-1444
public void findAllByEmptyPublisherOfIdShouldReturnResults() {
void findAllByEmptyPublisherOfIdShouldReturnResults() {
repository.findAllById(Flux.empty()).as(StepVerifier::create).verifyComplete();
}
@Test // DATAMONGO-1444
public void findAllWithSortShouldReturnResults() {
void findAllWithSortShouldReturnResults() {
repository.findAll(Sort.by(new Order(Direction.ASC, "age"))).as(StepVerifier::create) //
.expectNextCount(7) //
@@ -186,12 +197,12 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void countShouldReturnNumberOfRecords() {
void countShouldReturnNumberOfRecords() {
repository.count().as(StepVerifier::create).expectNext(7L).verifyComplete();
}
@Test // DATAMONGO-1444
public void insertEntityShouldInsertEntity() {
void insertEntityShouldInsertEntity() {
repository.deleteAll().as(StepVerifier::create).verifyComplete();
@@ -203,7 +214,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void insertShouldDeferredWrite() {
void insertShouldDeferredWrite() {
ReactivePerson person = new ReactivePerson("Homer", "Simpson", 36);
@@ -213,7 +224,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void insertIterableOfEntitiesShouldInsertEntity() {
void insertIterableOfEntitiesShouldInsertEntity() {
repository.deleteAll().as(StepVerifier::create).verifyComplete();
@@ -231,7 +242,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void insertPublisherOfEntitiesShouldInsertEntity() {
void insertPublisherOfEntitiesShouldInsertEntity() {
repository.deleteAll().as(StepVerifier::create).verifyComplete();
@@ -247,7 +258,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void saveEntityShouldUpdateExistingEntity() {
void saveEntityShouldUpdateExistingEntity() {
dave.setFirstname("Hello, Dave");
dave.setLastname("Bowman");
@@ -264,7 +275,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void saveEntityShouldInsertNewEntity() {
void saveEntityShouldInsertNewEntity() {
ReactivePerson person = new ReactivePerson("Homer", "Simpson", 36);
@@ -278,7 +289,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void saveIterableOfNewEntitiesShouldInsertEntity() {
void saveIterableOfNewEntitiesShouldInsertEntity() {
repository.deleteAll().as(StepVerifier::create).verifyComplete();
@@ -294,7 +305,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void saveIterableOfMixedEntitiesShouldInsertEntity() {
void saveIterableOfMixedEntitiesShouldInsertEntity() {
ReactivePerson person = new ReactivePerson("Homer", "Simpson", 36);
@@ -310,7 +321,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void savePublisherOfEntitiesShouldInsertEntity() {
void savePublisherOfEntitiesShouldInsertEntity() {
repository.deleteAll().as(StepVerifier::create).verifyComplete();
@@ -325,8 +336,20 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
assertThat(boyd.getId()).isNotNull();
}
@Test // GH-3609
void savePublisherOfImmutableEntitiesShouldInsertEntity() {
immutableRepository.deleteAll().as(StepVerifier::create).verifyComplete();
immutableRepository.saveAll(Flux.just(keith)).as(StepVerifier::create) //
.consumeNextWith(actual -> {
assertThat(actual.id).isNotNull();
}) //
.verifyComplete();
}
@Test // DATAMONGO-1444
public void deleteAllShouldRemoveEntities() {
void deleteAllShouldRemoveEntities() {
repository.deleteAll().as(StepVerifier::create).verifyComplete();
@@ -334,7 +357,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void deleteByIdShouldRemoveEntity() {
void deleteByIdShouldRemoveEntity() {
repository.deleteById(dave.id).as(StepVerifier::create).verifyComplete();
@@ -342,7 +365,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1712
public void deleteByIdUsingMonoShouldRemoveEntity() {
void deleteByIdUsingMonoShouldRemoveEntity() {
repository.deleteById(Mono.just(dave.id)).as(StepVerifier::create).verifyComplete();
@@ -350,7 +373,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1712
public void deleteByIdUsingFluxShouldRemoveEntity() {
void deleteByIdUsingFluxShouldRemoveEntity() {
repository.deleteById(Flux.just(dave.id, oliver.id)).as(StepVerifier::create).verifyComplete();
@@ -359,7 +382,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void deleteShouldRemoveEntity() {
void deleteShouldRemoveEntity() {
repository.delete(dave).as(StepVerifier::create).verifyComplete();
@@ -368,7 +391,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void deleteIterableOfEntitiesShouldRemoveEntities() {
void deleteIterableOfEntitiesShouldRemoveEntities() {
repository.deleteAll(Arrays.asList(dave, boyd)).as(StepVerifier::create).verifyComplete();
@@ -378,7 +401,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1444
public void deletePublisherOfEntitiesShouldRemoveEntities() {
void deletePublisherOfEntitiesShouldRemoveEntities() {
repository.deleteAll(Flux.just(dave, boyd)).as(StepVerifier::create).verifyComplete();
@@ -388,7 +411,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1619
public void findOneByExampleShouldReturnObject() {
void findOneByExampleShouldReturnObject() {
Example<ReactivePerson> example = Example.of(dave);
@@ -396,7 +419,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1619
public void findAllByExampleShouldReturnObjects() {
void findAllByExampleShouldReturnObjects() {
Example<ReactivePerson> example = Example.of(dave, matching().withIgnorePaths("id", "age", "firstname"));
@@ -404,7 +427,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1619
public void findAllByExampleAndSortShouldReturnObjects() {
void findAllByExampleAndSortShouldReturnObjects() {
Example<ReactivePerson> example = Example.of(dave, matching().withIgnorePaths("id", "age", "firstname"));
@@ -413,7 +436,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1619
public void countByExampleShouldCountObjects() {
void countByExampleShouldCountObjects() {
Example<ReactivePerson> example = Example.of(dave, matching().withIgnorePaths("id", "age", "firstname"));
@@ -421,7 +444,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1619
public void existsByExampleShouldReturnExisting() {
void existsByExampleShouldReturnExisting() {
Example<ReactivePerson> example = Example.of(dave, matching().withIgnorePaths("id", "age", "firstname"));
@@ -429,7 +452,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1619
public void existsByExampleShouldReturnNonExisting() {
void existsByExampleShouldReturnNonExisting() {
Example<ReactivePerson> example = Example.of(new ReactivePerson("foo", "bar", -1));
@@ -437,7 +460,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1619
public void findOneShouldEmitIncorrectResultSizeDataAccessExceptionWhenMoreThanOneElementFound() {
void findOneShouldEmitIncorrectResultSizeDataAccessExceptionWhenMoreThanOneElementFound() {
Example<ReactivePerson> example = Example.of(new ReactivePerson(null, "Matthews", -1),
matching().withIgnorePaths("age"));
@@ -446,19 +469,23 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
}
@Test // DATAMONGO-1907
public void findOneByExampleWithoutResultShouldCompleteEmpty() {
void findOneByExampleWithoutResultShouldCompleteEmpty() {
Example<ReactivePerson> example = Example.of(new ReactivePerson("foo", "bar", -1));
repository.findOne(example).as(StepVerifier::create).verifyComplete();
}
interface ReactivePersonRepostitory extends ReactiveMongoRepository<ReactivePerson, String> {
interface ReactivePersonRepository extends ReactiveMongoRepository<ReactivePerson, String> {
Flux<ReactivePerson> findByLastname(String lastname);
}
interface ReactiveImmutablePersonRepository extends ReactiveMongoRepository<ImmutableReactivePerson, String> {
}
@Data
@NoArgsConstructor
static class ReactivePerson {
@@ -469,11 +496,30 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
String lastname;
int age;
public ReactivePerson(String firstname, String lastname, int age) {
ReactivePerson(String firstname, String lastname, int age) {
this.firstname = firstname;
this.lastname = lastname;
this.age = age;
}
}
@With
@Value
static class ImmutableReactivePerson {
@Id String id;
String firstname;
String lastname;
int age;
ImmutableReactivePerson(@Nullable String id, String firstname, String lastname, int age) {
this.id = id;
this.firstname = firstname;
this.lastname = lastname;
this.age = age;
}
}
}

View File

@@ -159,6 +159,14 @@ public class StringBasedAggregationUnitTests {
assertThat(executeAggregation("returnCollection").result).isEqualTo(expected);
}
@Test // GH-3623
public void returnNullWhenSingleResultIsNotPresent() {
when(aggregationResults.getMappedResults()).thenReturn(Collections.emptyList());
assertThat(executeAggregation("simpleReturnType").result).isNull();
}
@Test // DATAMONGO-2153
public void returnRawResultType() {
assertThat(executeAggregation("returnRawResultType").result).isEqualTo(aggregationResults);
@@ -312,6 +320,9 @@ public class StringBasedAggregationUnitTests {
@Aggregation(RAW_GROUP_BY_LASTNAME_STRING)
Page<Person> invalidPageReturnType(Pageable page);
@Aggregation(RAW_GROUP_BY_LASTNAME_STRING)
String simpleReturnType();
}
static class PersonAggregate {

View File

@@ -19,6 +19,8 @@ import example.first.First
import io.mockk.mockk
import io.mockk.verify
import org.junit.Test
import org.springframework.data.mongodb.core.aggregation.Aggregation
import org.springframework.data.mongodb.core.aggregation.TypedAggregation
import org.springframework.data.mongodb.core.query.NearQuery
import org.springframework.data.mongodb.core.query.Query
import org.springframework.data.mongodb.core.query.Update
@@ -28,6 +30,7 @@ import reactor.core.publisher.Mono
* @author Sebastien Deleuze
* @author Christoph Strobl
* @author Mark Paluch
* @author Wonwoo Lee
*/
class ReactiveMongoOperationsExtensionsTests {
@@ -598,7 +601,6 @@ class ReactiveMongoOperationsExtensionsTests {
verify { operations.findDistinct(query, "field", "collection", First::class.java, String::class.java) }
}
@Test // DATAMONGO-1761
@Suppress("DEPRECATION")
fun `findDistinct(Query, String, KClass) should call java counterpart`() {
@@ -606,6 +608,55 @@ class ReactiveMongoOperationsExtensionsTests {
val query = mockk<Query>()
operations.findDistinct<String>(query, "field", First::class)
verify { operations.findDistinct(query, "field", First::class.java, String::class.java) }
verify {
operations.findDistinct(
query,
"field",
First::class.java,
String::class.java
)
}
}
@Test // #893
fun `aggregate(TypedAggregation, String, KClass) should call java counterpart`() {
val aggregation = mockk<TypedAggregation<String>>()
operations.aggregate<First>(aggregation, "foo")
verify { operations.aggregate(aggregation, "foo", First::class.java) }
}
@Test // #893
fun `aggregate(TypedAggregation, KClass) should call java counterpart`() {
val aggregation = mockk<TypedAggregation<String>>()
operations.aggregate<First>(aggregation)
verify { operations.aggregate(aggregation, First::class.java) }
}
@Test // #893
fun `aggregate(Aggregation, KClass) should call java counterpart`() {
val aggregation = mockk<Aggregation>()
operations.aggregate<String, First>(aggregation)
verify {
operations.aggregate(
aggregation,
String::class.java,
First::class.java
)
}
}
@Test // #893
fun `aggregate(Aggregation, String) should call java counterpart`() {
val aggregation = mockk<Aggregation>()
operations.aggregate<First>(aggregation, "foo")
verify { operations.aggregate(aggregation, "foo", First::class.java) }
}
}

View File

@@ -1446,6 +1446,7 @@ The geo-near operations return a `GeoResults` wrapper object that encapsulates `
MongoDB supports https://geojson.org/[GeoJSON] and simple (legacy) coordinate pairs for geospatial data. Those formats can both be used for storing as well as querying data. See the https://docs.mongodb.org/manual/core/2dsphere/#geospatial-indexes-store-geojson/[MongoDB manual on GeoJSON support] to learn about requirements and restrictions.
[[mongo.geo-json.domain.classes]]
==== GeoJSON Types in Domain Classes
Usage of https://geojson.org/[GeoJSON] types in domain classes is straightforward. The `org.springframework.data.mongodb.core.geo` package contains types such as `GeoJsonPoint`, `GeoJsonPolygon`, and others. These types extend the existing `org.springframework.data.geo` types. The following example uses a `GeoJsonPoint`:
@@ -1469,6 +1470,7 @@ public class Store {
----
====
[[mongo.geo-json.query-methods]]
==== GeoJSON Types in Repository Query Methods
Using GeoJSON types as repository query parameters forces usage of the `$geometry` operator when creating the query, as the following example shows:
@@ -1529,6 +1531,7 @@ repo.findByLocationWithin( <4>
<4> Use the legacy format `$polygon` operator.
====
[[mongo.geo-json.metrics]]
==== Metrics and Distance calculation
The MongoDB `$geoNear` operator allows usage of a GeoJSON Point or legacy coordinate pairs.
@@ -1700,6 +1703,29 @@ Returning the 3 Documents just like the GeoJSON variant:
<4> Distance from the center point in _Kilometers_ - multiply by 1000 to match the _Meters_ of the GeoJSON variant.
====
[[mongo.geo-json.jackson-modules]]
==== GeoJSON Jackson Modules
By using the <<core.web>>, Spring Data registers additional Jackson ``Module``s with the `ObjectMapper` for deserializing common Spring Data domain types.
Please refer to the <<core.web.basic.jackson-mappers>> section to learn more about the infrastructure setup of this feature.
The MongoDB module additionally registers ``JsonDeserializer``s for the following GeoJSON types via its `GeoJsonConfiguration`, which exposes the `GeoJsonModule`.
----
org.springframework.data.mongodb.core.geo.GeoJsonPoint
org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint
org.springframework.data.mongodb.core.geo.GeoJsonLineString
org.springframework.data.mongodb.core.geo.GeoJsonMultiLineString
org.springframework.data.mongodb.core.geo.GeoJsonPolygon
org.springframework.data.mongodb.core.geo.GeoJsonMultiPolygon
----
[NOTE]
====
The `GeoJsonModule` only registers ``JsonDeserializer``s!
The next major version (`4.0`) will register both ``JsonDeserializer``s and ``JsonSerializer``s for GeoJSON types by default.
====
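A short sketch of registering the module manually with a plain Jackson `ObjectMapper` (outside the web infrastructure described above):

[source,java]
----
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new GeoJsonModule()); // adds the JsonDeserializers listed above

GeoJsonPoint point = mapper.readValue("{ \"type\": \"Point\", \"coordinates\": [10.0, 20.0] }", GeoJsonPoint.class);
----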
[[mongo.textsearch]]
=== Full-text Queries
@@ -1731,7 +1757,7 @@ A query searching for `coffee cake` can be defined and run as follows:
[source,java]
----
Query query = TextQuery
.searching(new TextCriteria().matchingAny("coffee", "cake"));
.queryText(new TextCriteria().matchingAny("coffee", "cake"));
List<Document> page = template.find(query, Document.class);
----
@@ -1744,7 +1770,7 @@ To sort results by relevance according to the `weights` use `TextQuery.sortBySco
[source,java]
----
Query query = TextQuery
.searching(new TextCriteria().matchingAny("coffee", "cake"))
.queryText(new TextCriteria().matchingAny("coffee", "cake"))
.sortByScore() <1>
.includeScore(); <2>
@@ -1759,8 +1785,8 @@ You can exclude search terms by prefixing the term with `-` or by using `notMatc
[source,java]
----
// search for 'coffee' and not 'cake'
TextQuery.searching(new TextCriteria().matching("coffee").matching("-cake"));
TextQuery.searching(new TextCriteria().matching("coffee").notMatching("cake"));
TextQuery.queryText(new TextCriteria().matching("coffee").matching("-cake"));
TextQuery.queryText(new TextCriteria().matching("coffee").notMatching("cake"));
----
`TextCriteria.matching` takes the provided term as is. Therefore, you can define phrases by putting them between double quotation marks (for example, `\"coffee cake\"`) or by using `TextCriteria.phrase`. The following example shows both ways of defining a phrase:
@@ -1768,8 +1794,8 @@ TextQuery.searching(new TextCriteria().matching("coffee").notMatching("cake"));
[source,java]
----
// search for phrase 'coffee cake'
TextQuery.searching(new TextCriteria().matching("\"coffee cake\""));
TextQuery.searching(new TextCriteria().phrase("coffee cake"));
TextQuery.queryText(new TextCriteria().matching("\"coffee cake\""));
TextQuery.queryText(new TextCriteria().phrase("coffee cake"));
----
You can set flags for `$caseSensitive` and `$diacriticSensitive` by using the corresponding methods on `TextCriteria`. Note that these two optional flags were introduced in MongoDB 3.2 and are not included in the query unless explicitly set.
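For example, a brief sketch (assuming the corresponding `caseSensitive(…)` and `diacriticSensitive(…)` methods on `TextCriteria`; both flags are omitted from the query unless set):

[source,java]
----
TextQuery query = TextQuery.queryText(new TextCriteria()
    .matching("coffee")
    .caseSensitive(true)
    .diacriticSensitive(true));
----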
@@ -1860,8 +1886,6 @@ AggregationResults<TagCount> results = template.aggregate(aggregation, "tags", T
WARNING: Indexes are only used if the collation used for the operation matches the index collation.
include::./mongo-json-schema.adoc[leveloffset=+1]
<<mongo.repositories>> support `Collations` via the `collation` attribute of the `@Query` annotation.
.Collation support for Repositories
@@ -1902,186 +1926,7 @@ as shown in (1) and (2), will be included when creating the index.
TIP: The most specific `Collation` overrules any others that are defined. This means a method argument takes precedence over the query method annotation, which takes precedence over the domain type annotation.
====
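A hedged sketch of the `collation` attribute mentioned above, on a hypothetical repository query method:

[source,java]
----
public interface PersonRepository extends MongoRepository<Person, String> {

	@Query(value = "{ 'firstname' : ?0 }", collation = "en_US") // collation applied to this query only
	List<Person> findByFirstname(String firstname);
}
----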
[[mongo.jsonSchema]]
=== JSON Schema
As of version 3.6, MongoDB supports collections that validate documents against a provided https://docs.mongodb.com/manual/core/schema-validation/#json-schema[JSON Schema].
The schema itself and both validation action and level can be defined when creating the collection, as the following example shows:
.Sample JSON schema
====
[source,json]
----
{
"type": "object", <1>
"required": [ "firstname", "lastname" ], <2>
"properties": { <3>
"firstname": { <4>
"type": "string",
"enum": [ "luke", "han" ]
},
"address": { <5>
"type": "object",
"properties": {
"postCode": { "type": "string", "minLength": 4, "maxLength": 5 }
}
}
}
}
----
<1> JSON schema documents always describe a whole document from its root. A schema is itself a schema object that can contain
embedded schema objects describing properties and subdocuments.
<2> `required` is a property that describes which properties are required in a document. It can be specified optionally, along with other
schema constraints. See MongoDB's documentation on https://docs.mongodb.com/manual/reference/operator/query/jsonSchema/#available-keywords[available keywords].
<3> `properties` is related to a schema object that describes an `object` type. It contains property-specific schema constraints.
<4> `firstname` specifies constraints for the `firstname` field inside the document. Here, it is a string-based `properties` element declaring
possible field values.
<5> `address` is a subdocument defining a schema for values in its `postCode` field.
====
You can provide a schema either by specifying a schema document (that is, by using the `Document` API to parse or build a document object) or by building it with Spring Data's JSON schema utilities in `org.springframework.data.mongodb.core.schema`. `MongoJsonSchema` is the entry point for all JSON schema-related operations. The following example shows how to use `MongoJsonSchema.builder()` to create a JSON schema:
.Creating a JSON schema
====
[source,java]
----
MongoJsonSchema.builder() <1>
.required("firstname", "lastname") <2>
.properties(
string("firstname").possibleValues("luke", "han"), <3>
object("address")
.properties(string("postCode").minLength(4).maxLength(5)))
.build(); <4>
----
<1> Obtain a schema builder to configure the schema with a fluent API.
<2> Configure required properties.
<3> Configure the String-typed `firstname` field, allowing only `luke` and `han` values. Properties can be typed or untyped. Use a static import of `JsonSchemaProperty` to make the syntax slightly more compact and to get entry points such as `string(…)`.
<4> Build the schema object. Use the schema to create either a collection or <<mongodb-template-query.criteria,query documents>>.
====
There are already some predefined and strongly typed schema objects (`JsonSchemaObject` and `JsonSchemaProperty`) available
through static methods on the gateway interfaces.
However, you may need to build custom property validation rules, which can be created through the builder API, as the following example shows:
[source,java]
----
// "birthdate" : { "bsonType": "date" }
JsonSchemaProperty.named("birthdate").ofType(Type.dateType());
// "birthdate" : { "bsonType": "date", "description", "Must be a date" }
JsonSchemaProperty.named("birthdate").with(JsonSchemaObject.of(Type.dateType()).description("Must be a date"));
----
The schema builder also provides support for https://docs.mongodb.com/manual/core/security-client-side-encryption/[Client-Side Field Level Encryption]. Please refer to <<mongo.jsonSchema.encrypted-fields>> for more information.
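A minimal sketch of declaring an encrypted property (assuming the `encrypted(…)` entry point on `JsonSchemaProperty` and the algorithm shorthand shown; the field name is illustrative):
[source,java]
----
MongoJsonSchema schema = MongoJsonSchema.builder()
    .properties(
        encrypted(string("ssn"))
            .aead_aes_256_cbc_hmac_sha_512_deterministic()) // declares the encryption algorithm
    .build();
----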
`CollectionOptions` provides the entry point to schema support for collections, as the following example shows:
.Create collection with `$jsonSchema`
====
[source,java]
----
MongoJsonSchema schema = MongoJsonSchema.builder().required("firstname", "lastname").build();
template.createCollection(Person.class, CollectionOptions.empty().schema(schema));
----
====
You can use a schema to query any collection for documents that match a given structure defined by a JSON schema, as the following example shows:
.Query for Documents matching a `$jsonSchema`
====
[source,java]
----
MongoJsonSchema schema = MongoJsonSchema.builder().required("firstname", "lastname").build();
template.find(query(matchingDocumentStructure(schema)), Person.class);
----
====
The following table shows the supported JSON schema types:
[cols="3,1,6", options="header"]
.Supported JSON schema types
|===
| Schema Type
| Java Type
| Schema Properties
| `untyped`
| -
| `description`, generated `description`, `enum`, `allOf`, `anyOf`, `oneOf`, `not`
| `object`
| `Object`
| `required`, `additionalProperties`, `properties`, `minProperties`, `maxProperties`, `patternProperties`
| `array`
| any array except `byte[]`
| `uniqueItems`, `additionalItems`, `items`, `minItems`, `maxItems`
| `string`
| `String`
| `minLength`, `maxLength`, `pattern`
| `int`
| `int`, `Integer`
| `multipleOf`, `minimum`, `exclusiveMinimum`, `maximum`, `exclusiveMaximum`
| `long`
| `long`, `Long`
| `multipleOf`, `minimum`, `exclusiveMinimum`, `maximum`, `exclusiveMaximum`
| `double`
| `float`, `Float`, `double`, `Double`
| `multipleOf`, `minimum`, `exclusiveMinimum`, `maximum`, `exclusiveMaximum`
| `decimal`
| `BigDecimal`
| `multipleOf`, `minimum`, `exclusiveMinimum`, `maximum`, `exclusiveMaximum`
| `number`
| `Number`
| `multipleOf`, `minimum`, `exclusiveMinimum`, `maximum`, `exclusiveMaximum`
| `binData`
| `byte[]`
| (none)
| `boolean`
| `boolean`, `Boolean`
| (none)
| `null`
| `null`
| (none)
| `objectId`
| `ObjectId`
| (none)
| `date`
| `java.util.Date`
| (none)
| `timestamp`
| `BsonTimestamp`
| (none)
| `regex`
| `java.util.regex.Pattern`
| (none)
|===
NOTE: `untyped` is a generic type that is inherited by all typed schema types. It provides all `untyped` schema properties to typed schema types.
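For example, an untyped property such as `description` can be attached to a typed schema object, reusing the builder entry points shown earlier (a sketch; field name and text are illustrative):
[source,java]
----
// roughly: "firstname" : { "type": "string", "description": "The given name" }
JsonSchemaProperty.named("firstname").with(JsonSchemaObject.of(Type.stringType()).description("The given name"));
----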
For more information, see https://docs.mongodb.com/manual/reference/operator/query/jsonSchema/#op._S_jsonSchema[$jsonSchema].
include::./mongo-json-schema.adoc[leveloffset=+1]
[[mongo.query.fluent-template-api]]
=== Fluent Template API
@@ -2221,6 +2066,7 @@ With the introduction of <<mongo.transactions>> this was no longer possible beca
So in version 2.x `MongoOperations.count()` would use the collection statistics if no transaction was in progress, and the aggregation variant if so.
As of Spring Data MongoDB 3.x, any `count` operation uses the aggregation-based count approach via MongoDB's `countDocuments`, regardless of whether filter criteria are present.
If the application can tolerate the limitations of working on collection statistics, `MongoOperations.estimatedCount()` offers an alternative, as shown in the sketch below.
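A minimal sketch contrasting the two approaches (assuming a `Person` collection and a `MongoTemplate` instance named `template`):
[source,java]
----
// exact count, executed via MongoDB's countDocuments (aggregation-based)
long exact = template.count(new Query(), Person.class);

// fast, but potentially inaccurate count derived from collection statistics
long estimate = template.estimatedCount(Person.class);
----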
[NOTE]
====


@@ -97,3 +97,10 @@ Query query = new Query(new Criteria().alike(example));
List<Person> result = template.find(query, Person.class);
----
====
[NOTE]
====
`UntypedExampleMatcher` is likely the right choice for you if you are storing different entities within a single collection or opted out of writing <<mongo-template.type-mapping,type hints>>.
Also, keep in mind that using `@TypeAlias` requires eager initialization of the `MappingContext`. To do so, configure `initialEntitySet` to ensure proper alias resolution for read operations, as outlined in the sketch after this note.
====
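A minimal configuration sketch (assuming `AbstractMongoClientConfiguration`; the database name and entity classes are illustrative):
[source,java]
----
@Configuration
class MongoConfig extends AbstractMongoClientConfiguration {

    @Override
    protected String getDatabaseName() {
        return "database";
    }

    // register aliased entities up front so @TypeAlias values resolve on reads
    @Override
    protected Set<Class<?>> getInitialEntitySet() {
        return Set.of(Person.class, Contact.class);
    }
}
----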


@@ -1,6 +1,156 @@
Spring Data MongoDB Changelog
=============================
Changes in version 3.1.9 (2021-05-14)
-------------------------------------
Changes in version 3.2.0 (2021-04-14)
-------------------------------------
* #3623 - `@Aggregation` repository query method causes `NullPointerException` when the result is empty.
* #3621 - Upgrade to MongoDB Java Drivers 4.2.3.
* #3612 - Upgrade to MongoDB 4.4 on CI.
* #3601 - Criteria object not allowing to use field names with underscore in them.
* #3583 - Support aggregation expression on fields projection.
* #3414 - Criteria or toEquals fail if contains regex [DATAMONGO-2559].
Changes in version 3.1.8 (2021-04-14)
-------------------------------------
* #3623 - `@Aggregation` repository query method causes `NullPointerException` when the result is empty.
* #3601 - Criteria object not allowing to use field names with underscore in them.
* #3414 - Criteria or toEquals fail if contains regex [DATAMONGO-2559].
Changes in version 3.0.9.RELEASE (2021-04-14)
---------------------------------------------
* #3623 - `@Aggregation` repository query method causes `NullPointerException` when the result is empty.
* #3609 - SimpleReactiveMongoRepository#saveAll does not populate @Id property if it is immutable.
* #3414 - Criteria or toEquals fail if contains regex [DATAMONGO-2559].
Changes in version 3.1.7 (2021-03-31)
-------------------------------------
* #3613 - Use StringUtils.replace(…) instead of String.replaceAll(…) for mapKeyDotReplacement.
* #3609 - SimpleReactiveMongoRepository#saveAll does not populate @Id property if it is immutable.
Changes in version 3.2.0-RC1 (2021-03-31)
-----------------------------------------
* #3613 - Use StringUtils.replace(…) instead of String.replaceAll(…) for mapKeyDotReplacement.
* #3609 - SimpleReactiveMongoRepository#saveAll does not populate @Id property if it is immutable.
* #3600 - Rename Embedded annotation -> Unwrapped.
* #3583 - Support aggregation expression on fields projection.
Changes in version 3.2.0-M5 (2021-03-17)
----------------------------------------
* #3592 - Remove @Persistent from entity-scan include filters.
* #3590 - Embedded sharding keys are not correctly picked up from the shardKeySource Document.
* #3580 - Fix CustomConverter conversion lookup.
* #3579 - Upgrade to MongoDB Java Drivers 4.2.2.
* #3575 - Introduce ConversionContext and clean up MappingMongoConverter.
* #3573 - Json Schema section appears twice in reference documentation.
* #3571 - Introduce ConversionContext and clean up MappingMongoConverter.
* #3570 - Incorrect class casting cause ClassCastException when save java.util.Collection using MongoTemplate.
* #3568 - MongoSocketWriteException may be translated into DataAccessResourceFailureException.
* #3566 - Couldn't find PersistentEntity for type java.lang.Object when updating a field with suffix "class".
* #3552 - UpdateMapper drops numeric keys in Maps.
* #3395 - Derived findBy…IgnoreCaseIn query doesn't return expected results [DATAMONGO-2540].
* #3286 - Add possibility to use Collection<Criteria> as parameter in and/or/nor operators [DATAMONGO-2428].
* #2911 - ensureNotIterable in MongoTemplate only checks for array type [DATAMONGO-2044].
* #590 - DATAMONGO-2044 make ensureNotIterable actually check if object is iterable.
Changes in version 3.1.6 (2021-03-17)
-------------------------------------
* #3592 - Remove @Persistent from entity-scan include filters.
* #3590 - Embedded sharding keys are not correctly picked up from the shardKeySource Document.
* #3589 - Upgrade to MongoDB Driver 4.1.2.
* #3573 - Json Schema section appears twice in reference documentation.
* #3568 - MongoSocketWriteException may be translated into DataAccessResourceFailureException.
* #3566 - Couldn't find PersistentEntity for type java.lang.Object when updating a field with suffix "class".
* #3552 - UpdateMapper drops numeric keys in Maps.
* #3395 - Derived findBy…IgnoreCaseIn query doesn't return expected results [DATAMONGO-2540].
Changes in version 3.0.8.RELEASE (2021-03-17)
---------------------------------------------
* #3590 - Embedded sharding keys are not correctly picked up from the shardKeySource Document.
* #3588 - Upgrade to MongoDB Driver 4.0.6.
* #3573 - Json Schema section appears twice in reference documentation.
* #3568 - MongoSocketWriteException may be translated into DataAccessResourceFailureException.
* #3566 - Couldn't find PersistentEntity for type java.lang.Object when updating a field with suffix "class".
* #3552 - UpdateMapper drops numeric keys in Maps.
* #3395 - Derived findBy…IgnoreCaseIn query doesn't return expected results [DATAMONGO-2540].
Changes in version 3.2.0-M4 (2021-02-18)
----------------------------------------
Changes in version 3.1.5 (2021-02-18)
-------------------------------------
Changes in version 3.2.0-M3 (2021-02-17)
----------------------------------------
* #3553 - Upgrade to MongoDB driver 4.2.0.
* #3546 - org.bson.codecs.configuration.CodecConfigurationException: The uuidRepresentation has not been specified, so the UUID cannot be encoded.
* #3544 - alike Criteria can't add andOperator.
* #3542 - Relax field name checks for TypedAggregations.
* #3540 - Allow access to mongoDatabaseFactory used in ReactiveMongoTemplate.
* #3529 - Update repository after GitHub issues migration.
* #3525 - Bug in full text query documentation [DATAMONGO-2673].
* #3517 - GeoJson: Improper Deserialization of Document with a GeoJsonPolygon [DATAMONGO-2664].
* #3508 - Add ReactiveMongoOperations.aggregate(…) Kotlin extension [DATAMONGO-2655].
* #3474 - Search by alike() criteria is broken when type alias information is not available [DATAMONGO-2620].
* #3055 - Improve count() and countDocuments() mapping documentation and/or method availability [DATAMONGO-2192].
* #2803 - Support flattening embedded/nested objects [DATAMONGO-1902].
Changes in version 3.1.4 (2021-02-17)
-------------------------------------
* #3546 - org.bson.codecs.configuration.CodecConfigurationException: The uuidRepresentation has not been specified, so the UUID cannot be encoded.
* #3544 - alike Criteria can't add andOperator.
* #3540 - Allow access to mongoDatabaseFactory used in ReactiveMongoTemplate.
* #3525 - Bug in full text query documentation [DATAMONGO-2673].
* #3517 - GeoJson: Improper Deserialization of Document with a GeoJsonPolygon [DATAMONGO-2664].
* #3508 - Add ReactiveMongoOperations.aggregate(…) Kotlin extension [DATAMONGO-2655].
* #3474 - Search by alike() criteria is broken when type alias information is not available [DATAMONGO-2620].
* #3055 - Improve count() and countDocuments() mapping documentation and/or method availability [DATAMONGO-2192].
Changes in version 3.0.7.RELEASE (2021-02-17)
---------------------------------------------
* DATAMONGO-2671 - DateFromParts millisecondsOf returns "milliseconds" as $dateFromParts function but it should be millisecond.
* DATAMONGO-2665 - Update CI jobs with Docker Login.
* #3544 - alike Criteria can't add andOperator.
* #3534 - Update copyright year to 2021.
* #3529 - Update repository after GitHub issues migration.
* #3525 - Bug in full text query documentation [DATAMONGO-2673].
* #3517 - GeoJson: Improper Deserialization of Document with a GeoJsonPolygon [DATAMONGO-2664].
* #3474 - Search by alike() criteria is broken when type alias information is not available [DATAMONGO-2620].
Changes in version 2.2.13.RELEASE (2021-02-17)
----------------------------------------------
* #3544 - alike Criteria can't add andOperator.
* #3534 - Update copyright year to 2021.
* #3529 - Update repository after GitHub issues migration.
* #3525 - Bug in full text query documentation [DATAMONGO-2673].
Changes in version 3.2.0-M2 (2021-01-13)
----------------------------------------
* DATAMONGO-2671 - DateFromParts millisecondsOf returns "milliseconds" as $dateFromParts function but it should be millisecond.
* DATAMONGO-2665 - Update CI jobs with Docker Login.
* DATAMONGO-2651 - Allow AggregationExpression as part of group operation.
* #3534 - Update copyright year to 2021.
* #3529 - Update repository after GitHub issues migration.
* #3515 - Deprecate KPropertyPath in favor of Spring Data Common's KPropertyPath [DATAMONGO-2662].
Changes in version 3.1.3 (2021-01-13)
-------------------------------------
* DATAMONGO-2671 - DateFromParts millisecondsOf returns "milliseconds" as $dateFromParts function but it should be millisecond.
@@ -3254,6 +3404,22 @@ Repository


@@ -1,4 +1,4 @@
Spring Data MongoDB 3.1.3 (2020.0.3)
Spring Data MongoDB 3.1.9 (2020.0.9)
Copyright (c) [2010-2019] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").
@@ -22,4 +22,10 @@ conditions of the subcomponent's license, as noted in the LICENSE file.