Compare commits


102 Commits

Author SHA1 Message Date
Mark Paluch
56972d63b6 After release cleanups.
See #4206
2022-11-18 10:35:10 +01:00
Mark Paluch
4dabfd790a Prepare next development iteration.
See #4206
2022-11-18 10:35:08 +01:00
Mark Paluch
4bca0ca015 Release version 3.3.10 (2021.1.10).
See #4206
2022-11-18 10:22:09 +01:00
Mark Paluch
80d63a576c Prepare 3.3.10 (2021.1.10).
See #4206
2022-11-18 10:21:05 +01:00
Mark Paluch
0f6ee3ddbc Upgrade to Java 8u345-b01-jdk-focal and pin base image distribution.
See #4206
2022-10-31 13:20:07 +01:00
Mark Paluch
c3a5f325d2 Use correct boolean type for JSON Schema creation.
We now use the correct JSON type boolean again when creating schemas. Furthermore, we use the bool type for MongoDB $type queries.

Closes #4220
2022-10-27 10:17:55 +02:00
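A quick sketch of the distinction the fix restores (using plain MongoDB driver Document types rather than the Spring Data schema builder): JSON Schema keeps the JSON type name boolean, while $type queries use the BSON alias bool.

    import org.bson.Document;

    class BooleanTypeExample {

        public static void main(String[] args) {
            // JSON Schema: the "type" keyword uses the JSON type name "boolean".
            Document schema = new Document("$jsonSchema",
                    new Document("properties",
                            new Document("active", new Document("type", "boolean"))));

            // $type query: the BSON type alias for booleans is "bool".
            Document typeFilter = new Document("active", new Document("$type", "bool"));

            System.out.println(schema.toJson());
            System.out.println(typeFilter.toJson());
        }
    }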
Spring Builds
957397eff9 After release cleanups.
See #4205
2022-10-13 12:37:05 +00:00
Spring Builds
f322d5b18f Prepare next development iteration.
See #4205
2022-10-13 12:36:52 +00:00
Spring Builds
24e423b8d1 Release version 3.3.9 (2021.1.9).
See #4205
2022-10-13 11:54:15 +00:00
Spring Builds
4d5f3c66d5 Prepare 3.3.9 (2021.1.9).
See #4205
2022-10-13 11:51:37 +00:00
Spring Builds
f29d03f9c8 After release cleanups.
See #4172
2022-10-13 07:52:07 +00:00
Spring Builds
fbf4726e56 Prepare next development iteration.
See #4172
2022-10-13 07:51:54 +00:00
Spring Builds
85382f0dd8 Release version 3.3.8 (2021.1.8).
See #4172
2022-10-13 07:28:21 +00:00
Spring Builds
739b44f6e5 Prepare 3.3.8 (2021.1.8).
See #4172
2022-10-13 07:25:55 +00:00
Christoph Strobl
49cd518647 Update tests.
Original Pull Request: #4196
2022-10-11 09:55:54 +02:00
gongxuanzhang
b59c7f774f Fix json schema type name for boolean.
The type name was boolean but should have been bool.

Closes: #4196
2022-10-11 09:55:08 +02:00
Christoph Strobl
b7ac5f7970 Update reactive transaction sample in reference documentation.
Closes: #4190
2022-10-06 13:18:44 +02:00
Christoph Ahlers
9af1689fbf Fix javadoc parameter names.
Closes: #4179
2022-10-04 12:39:48 +02:00
Wan Bachtiar
51ca3be48f Fix typo in reference documentation.
Closes: #4180
2022-10-04 12:25:43 +02:00
Seungwoo Jo
8f8e9c6585 Fix documentation typo in BasicQuery.
Closes #4169
Original pull request: #4170.
2022-09-21 10:58:22 +02:00
Spring Builds
092217e425 After release cleanups.
See #4115
2022-09-19 09:04:54 +00:00
Spring Builds
9d1e1b8c17 Prepare next development iteration.
See #4115
2022-09-19 09:04:41 +00:00
Spring Builds
5cf73168f7 Release version 3.3.7 (2021.1.7).
See #4115
2022-09-19 08:40:19 +00:00
Spring Builds
792d7199f0 Prepare 3.3.7 (2021.1.7).
See #4115
2022-09-19 08:38:01 +00:00
Christoph Strobl
34a2d09303 Fix issue with reference conversion in updates.
We now make sure to convert references in update operations targeting collection-like fields when using e.g. the push modifier.

Closes #4041
Original pull request: #4045.
2022-09-19 08:53:43 +02:00
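A hedged sketch of the scenario addressed above; the Order and Item types and field names are hypothetical, the point being that the pushed value is now run through reference conversion.

    import java.util.List;

    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.mapping.DocumentReference;
    import org.springframework.data.mongodb.core.query.Criteria;
    import org.springframework.data.mongodb.core.query.Query;
    import org.springframework.data.mongodb.core.query.Update;

    class PushReferenceExample {

        static class Item {
            String id;
        }

        static class Order {
            String id;

            @DocumentReference // collection-like field holding references
            List<Item> items;
        }

        static void addItem(MongoTemplate template, String orderId, Item item) {
            // The pushed value is now converted into its reference representation
            // instead of being stored as an embedded document.
            template.updateFirst(
                    Query.query(Criteria.where("id").is(orderId)),
                    new Update().push("items", item),
                    Order.class);
        }
    }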
Mark Paluch
26554e3031 Polishing.
See #4061
Original pull request: #4062.
2022-09-16 14:53:08 +02:00
Christoph Strobl
018fe623c0 Improve exception message when deriving collection name from type.
We now provide a better worded exception message when trying to derive the collection name for a type that is not considered a user type (such as org.bson.Document).
Update the Javadoc to hint at the error.

Closes #4061
Original pull request: #4062.
2022-09-16 14:53:06 +02:00
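A short sketch of the situation the improved message covers (hypothetical usage): asking for the collection name of a non-user type such as org.bson.Document cannot work and now fails with a clearer explanation.

    import org.springframework.data.mongodb.core.MongoTemplate;

    class CollectionNameExample {

        static class Order {
        }

        static void illustrate(MongoTemplate template) {
            // Works: a mapped domain type resolves to its collection name.
            String orders = template.getCollectionName(Order.class);

            // Throws: org.bson.Document is not a user type, so no collection name
            // can be derived; the exception message now says so explicitly.
            String invalid = template.getCollectionName(org.bson.Document.class);
        }
    }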
Christoph Strobl
27e6b5a9be Initialize lists with size where possible.
Closes #3941
Original pull request: #3974.
2022-09-16 14:44:11 +02:00
Mark Paluch
c9be849e62 Polishing.
Reformat code.

See #4167.
Original pull request: #4168.
2022-09-16 14:41:29 +02:00
Christoph Strobl
da5f24981c Fix usage of change stream option startAfter.
We now make sure to apply the token to the startAfter method of the driver. Before this change it had been incorrectly applied to resumeAfter.

Closes #4167.
Original pull request: #4168.
2022-09-16 14:41:29 +02:00
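A minimal sketch of the option that is now wired to the right driver method, assuming the builder's startAfter(BsonValue) method and a previously captured resume token; the reactive template is used for brevity.

    import org.bson.BsonValue;
    import org.bson.Document;
    import org.springframework.data.mongodb.core.ChangeStreamOptions;
    import org.springframework.data.mongodb.core.ReactiveMongoTemplate;

    import reactor.core.publisher.Flux;

    class StartAfterExample {

        static Flux<Document> resume(ReactiveMongoTemplate template, BsonValue resumeToken) {
            ChangeStreamOptions options = ChangeStreamOptions.builder()
                    .startAfter(resumeToken) // now applied to the driver's startAfter, not resumeAfter
                    .build();

            return template.changeStream("orders", options, Document.class)
                    .map(event -> event.getBody());
        }
    }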
Mark Paluch
7c7b05f10d Polishing.
Fix generics. Add warning suppressions for nullability checks.

See: #4104
Original pull request: #4156.
2022-09-14 14:07:27 +02:00
Christoph Strobl
ed4f30ab07 Fix GeoJson polygon conversion for polygons with inner ring.
Closes: #4104
Original pull request: #4156.
2022-09-14 14:07:26 +02:00
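A sketch of the affected shape, assuming GeoJsonPolygon's withInnerRing(...) API: a polygon whose hole (inner ring) is now converted correctly.

    import java.util.Arrays;

    import org.springframework.data.geo.Point;
    import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;

    class PolygonWithHoleExample {

        static GeoJsonPolygon polygonWithHole() {
            // Outer ring; GeoJSON rings must be closed (first point equals last point).
            GeoJsonPolygon polygon = new GeoJsonPolygon(
                    new Point(0, 0), new Point(0, 10), new Point(10, 10), new Point(0, 0));

            // Inner ring describing the hole; conversion of this ring was previously broken.
            return polygon.withInnerRing(Arrays.asList(
                    new Point(2, 2), new Point(2, 4), new Point(4, 4), new Point(2, 2)));
        }
    }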
Christoph Strobl
cf38ba15bf Allow referencing the $id field of dbrefs within an aggregation pipeline.
Closes: #4123
Original pull request: #4125.
2022-08-05 14:10:58 +02:00
Sojin
daf12a6e2b Fix AKNOWLEDGED typo in reference documentation.
Two typos found in the documentation have been fixed.

Closes #4135
2022-08-05 14:08:15 +02:00
Christoph Strobl
d0481d089e After release cleanups.
See #4090
2022-07-15 10:47:35 +02:00
Christoph Strobl
169c35789d Prepare next development iteration.
See #4090
2022-07-15 10:47:31 +02:00
Christoph Strobl
0de55deb03 Release version 3.3.6 (2021.1.6).
See #4090
2022-07-15 10:34:16 +02:00
Christoph Strobl
01ac35fa31 Prepare 3.3.6 (2021.1.6).
See #4090
2022-07-15 10:33:37 +02:00
Mark Paluch
f5c0318a14 Avoid duplicate bean registrations in MappingMongoConverterParser.
We now ensure to not override `ValidatingMongoEventListener` and `LocalValidatorFactoryBean` bean definitions by avoiding duplicate registrations and checking whether a bean with the given name is already registered.

Closes #4087
2022-06-28 10:25:15 +02:00
Mark Paluch
7a0debe335 After release cleanups.
See #4029
2022-06-20 11:12:22 +02:00
Mark Paluch
0a79ad6585 Prepare next development iteration.
See #4029
2022-06-20 11:12:19 +02:00
Mark Paluch
404ce6a987 Release version 3.3.5 (2021.1.5).
See #4029
2022-06-20 10:59:10 +02:00
Mark Paluch
232a9c9943 Prepare 3.3.5 (2021.1.5).
See #4029
2022-06-20 10:58:43 +02:00
Christoph Strobl
7c5ac764b3 Retain parameter type when binding parameters in annotated Query/Aggregation.
This commit ensures the parameter type is preserved when binding parameters used within the value of the Query or Aggregation annotation.

Closes: #4089
2022-06-20 10:37:13 +02:00
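The kind of declaration affected, as a small sketch (standard @Query parameter binding; the Person repository is hypothetical): the ?0 placeholder is now bound with the declared parameter type.

    import java.util.List;

    import org.springframework.data.mongodb.repository.MongoRepository;
    import org.springframework.data.mongodb.repository.Query;

    interface PersonRepository extends MongoRepository<Person, String> {

        // ?0 is bound as the declared int parameter, so the resulting query
        // matches numeric values rather than a stringified placeholder.
        @Query("{ 'age' : ?0 }")
        List<Person> findByAge(int age);
    }

    class Person {
        String id;
        int age;
    }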
Mark Paluch
864c94f490 Wrap SpEL documentation with admonition.
Closes #4085
2022-06-14 09:19:25 +02:00
Mark Paluch
ebc4678aa3 Polishing.
Reformat asciidoc source.

See #4085
2022-06-14 09:19:22 +02:00
Mark Paluch
28d6e67686 Upgrade to Maven Wrapper 3.8.5.
See #4075
2022-06-03 14:40:37 +02:00
John Blum
ca229cdb99 Remove Docker Registry login.
Closes #4056.
2022-05-16 13:00:16 -07:00
Mark Paluch
611100e6f4 Update driver compatibility matrix.
Closes #4052
2022-05-16 15:12:29 +02:00
Christoph Strobl
3a6d6bbfed Polishing.
Update Query javadoc.

Original Pull Request: #3999
2022-05-10 16:34:11 +02:00
Raul Mello Silva
286ff1c4a1 Update Query.limit javadoc.
This commit documents the behavior of Query.limit(int): when set to zero or a negative value, the limit is treated as unlimited.

Closes: #3999
2022-05-10 16:33:50 +02:00
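The documented behavior in a nutshell (a minimal sketch): a zero or negative limit means no limit is applied.

    import static org.springframework.data.mongodb.core.query.Criteria.where;

    import org.springframework.data.mongodb.core.query.Query;

    class LimitExample {

        static Query limited() {
            // Returns at most 10 documents.
            return new Query(where("status").is("ACTIVE")).limit(10);
        }

        static Query unlimited() {
            // A zero (or negative) limit removes the restriction entirely.
            return new Query(where("status").is("ACTIVE")).limit(0);
        }
    }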
Christoph Strobl
5df195db15 Provide additional meta information via pom.xml
Add scm & issueManagement.

Closes: #4048
2022-05-10 12:39:52 +02:00
nniesen
e437865707 Update spring.io project urls.
This commit updates outdated projects.spring.io links to spring.io/projects.

Closes: #4042
2022-05-09 13:59:05 +02:00
Christoph Strobl
13888ab7cd After release cleanups.
See #4001
2022-04-19 12:13:23 +02:00
Christoph Strobl
ced6a1b190 Prepare next development iteration.
See #4001
2022-04-19 12:13:20 +02:00
Christoph Strobl
e704f147ad Release version 3.3.4 (2021.1.4).
See #4001
2022-04-19 12:03:13 +02:00
Christoph Strobl
67ea27d9e4 Prepare 3.3.4 (2021.1.4).
See #4001
2022-04-19 12:02:43 +02:00
Mark Paluch
a9ed00530f After release cleanups.
See #3972
2022-03-21 15:06:37 +01:00
Mark Paluch
75f73756c9 Prepare next development iteration.
See #3972
2022-03-21 15:06:35 +01:00
Mark Paluch
4b137cfd55 Release version 3.3.3 (2021.1.3).
See #3972
2022-03-21 14:58:52 +01:00
Mark Paluch
762aa62b2a Prepare 3.3.3 (2021.1.3).
See #3972
2022-03-21 14:58:27 +01:00
Mark Paluch
8f4a8dcbee Use Java 8 to build snapshots for Artifactory.
Closes #3976
2022-03-15 14:33:04 +01:00
Mark Paluch
0aa92031a3 Polishing.
Add missing Override annotations to template API methods.

See #3984
2022-03-11 15:17:52 +01:00
Christoph Strobl
8aa52c129c Modify visibility of methods in TypedJsonSchemaObject.
Change visibility to public as it should have been in the first place.

Closes: #3989
2022-03-10 09:23:13 +01:00
sangyongchoi
03ac725080 Remove duplicate condition in GeoConverters.
Closes: #3981
2022-03-03 12:51:25 +01:00
Mark Paluch
5258b36080 Update CI properties.
See #3972
2022-02-22 14:09:14 +01:00
Mark Paluch
ddbec07643 Upgrade to Maven Wrapper 3.8.4.
See #3978
2022-02-22 13:56:07 +01:00
Mark Paluch
f4375fc54d Polishing.
Externalize artifactory credentials identifier.

See #3976
2022-02-22 09:17:10 +01:00
Mark Paluch
ccbd18ff6a Use Java 17 to build snapshots for Artifactory.
Closes #3976
2022-02-22 09:17:08 +01:00
Mark Paluch
490ef81d7b After release cleanups.
See #3935
2022-02-18 10:49:01 +01:00
Mark Paluch
92668635b1 Prepare next development iteration.
See #3935
2022-02-18 10:48:59 +01:00
Mark Paluch
92a07fd024 Release version 3.3.2 (2021.1.2).
See #3935
2022-02-18 10:41:00 +01:00
Mark Paluch
7ebc7d08ed Prepare 3.3.2 (2021.1.2).
See #3935
2022-02-18 10:40:38 +01:00
Christoph Strobl
1fabfe0385 Serialize values for debug output safely in AbstractMongoEventListener.
We now make sure that codec configuration will not cause an exception when debug logging is turned on.

Resolves: #3968
Original Pull Request: #3970
2022-02-18 10:12:55 +01:00
Christoph Strobl
1849afd78b Update copyright year to 2022.
See: #3966
2022-02-16 10:24:07 +01:00
Greg L. Turnquist
8f38113906 Update CI properties.
See #3935
2022-02-14 14:39:49 -06:00
blu10ph
39593a0388 Avoid obtaining mapped sort multiple times for mapReduce.
Apply the already mapped sort for mapReduce instead of running the source document through the mapping layer again.

Closes: #3960
2022-02-11 11:24:58 +01:00
Christoph Strobl
36e639fa51 Favor Base64Utils over bson internal Base64 type.
org.bson.internal.Base64 is no longer available in MongoDB driver 4.5.0.

Related to: #3962
2022-02-11 08:27:23 +01:00
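The replacement in a nutshell (a sketch): Spring's Base64Utils wraps java.util.Base64 and has no dependency on driver internals.

    import java.nio.charset.StandardCharsets;

    import org.springframework.util.Base64Utils;

    class Base64Example {

        public static void main(String[] args) {
            byte[] raw = "spring-data-mongodb".getBytes(StandardCharsets.UTF_8);

            // org.bson.internal.Base64 is gone in driver 4.5.0; use Spring's utility instead.
            String encoded = Base64Utils.encodeToString(raw);
            byte[] decoded = Base64Utils.decodeFromString(encoded);

            System.out.println(encoded + " -> " + new String(decoded, StandardCharsets.UTF_8));
        }
    }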
Christoph Strobl
087bef0f0a Upgrade to MongoDB driver 4.4.2.
Closes: #3958
2022-02-09 08:05:03 +01:00
Greg L. Turnquist
899fb5ee0c Use Harbor Proxy for containers.
Leverage internal infrastructure for pulling Docker container images. Reduces pressure on Docker Hub and reduces risk of hitting rate limits.

See #3954.
Related https://github.com/spring-projects/spring-data-build/issues/1630.
2022-02-07 10:56:55 -06:00
Mark Paluch
68530f0e45 Polishing.
Extract docker credentials into properties file.
Use tabs for indentation instead of spaces.

See #3949
2022-02-03 15:45:55 +01:00
Greg L. Turnquist
742cc9e983 Externalize build properties.
By reading a properties file from an external location, it is possible to inject a consistent set of properties from Spring Data Build. This also supports repeatable builds.

Closes #3949.
2022-02-03 15:13:43 +01:00
Mihail Cornescu
c9657c3aa4 Add IgnoreCase to repository queries documentation.
Update the reference documentation and add the missing IgnoreCase keyword.

Closes: #3916
Original Pull Request: #3950
2022-02-02 13:20:02 +01:00
Christoph Strobl
d340125ed5 After release cleanups.
See #3877
2022-01-14 10:45:00 +01:00
Christoph Strobl
5a702b1624 Prepare next development iteration.
See #3877
2022-01-14 10:44:57 +01:00
Christoph Strobl
4b92ecc337 Release version 3.3.1 (2021.1.1).
See #3877
2022-01-14 10:28:56 +01:00
Christoph Strobl
78899c757f Prepare 3.3.1 (2021.1.1).
See #3877
2022-01-14 10:28:25 +01:00
Christoph Strobl
a3861b607a Avoid schema keyId uuid representation errors.
To avoid driver-configuration-specific UUID representation format errors (binary subtype 3 vs. subtype 4), we now directly convert the given key into its subtype 4 format.

Resolves: #3929
Original pull request: #3931.
2022-01-13 15:26:58 +01:00
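The conversion at the heart of the fix, sketched with plain driver types: a UUID key id rendered as a BSON binary of subtype 4, independent of the client's configured UuidRepresentation.

    import java.util.UUID;

    import org.bson.BsonBinary;
    import org.bson.UuidRepresentation;

    class KeyIdExample {

        public static void main(String[] args) {
            UUID keyId = UUID.randomUUID();

            // Subtype 4 (STANDARD) representation, regardless of whether the driver
            // is configured for the legacy subtype 3 (JAVA_LEGACY) format.
            BsonBinary binaryKeyId = new BsonBinary(keyId, UuidRepresentation.STANDARD);

            System.out.println(binaryKeyId.getType()); // prints 4
        }
    }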
Mark Paluch
ee4160997b Polishing.
Simplify assertions, reformat code.

See #3921
Original pull request: #3930.
2022-01-13 11:05:41 +01:00
Christoph Strobl
aeeac56d19 Use index instead of iterator to map position and map keys for updates.
This commit removes usage of the iterator and replaces map key and positional parameter mappings with an index based token lookup.

Closes #3921
Original pull request: #3930.
2022-01-13 11:05:41 +01:00
Mark Paluch
e9c15eb169 Polishing.
Reformat code. Tweak documentation wording and callout syntax.

See #3914, see #3901
Original pull request: #3915.
2022-01-12 15:59:05 +01:00
Christoph Strobl
7f223d1332 Avoid creating invalid index definitions for Map-like properties.
This commit makes sure to exclude Map-like structures from index inspection unless they are annotated with WildcardIndexed.

Closes #3914, closes #3901
Original pull request: #3915.
2022-01-12 15:57:40 +01:00
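A sketch of the opt-in behavior (field names are hypothetical): Map-like properties are skipped by index resolution unless explicitly annotated.

    import java.util.Map;

    import org.springframework.data.mongodb.core.index.WildcardIndexed;
    import org.springframework.data.mongodb.core.mapping.Document;

    @Document
    class Account {

        String id;

        // No longer inspected for (potentially invalid) nested index definitions.
        Map<String, String> settings;

        // Opt in explicitly: a wildcard index covering the keys of this map.
        @WildcardIndexed
        Map<String, String> attributes;
    }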
Mark Paluch
cffee123dc Polishing.
Add author tags, extend copyright license years, simplify tests.

See #3892
2022-01-12 15:35:26 +01:00
rolag-it
352376166a Fix pagination with reactive fluent Querydsl query definition.
The Pageable object was not passed to the Query, so fetchPage erroneously retrieved the whole dataset as the Page content.

Closes #3892
2022-01-12 15:35:26 +01:00
Hett
64b0096c7b Avoid double call of fetch method in DefaultReferenceResolver.
This commit fixes an issue where the fetch method is called twice when looking up single-value references.

Resolves: #3918
Original Pull Request: #3919
2022-01-11 09:45:12 +01:00
Mark Paluch
fb905761a0 Upgrade to MongoDB driver 4.4.1.
Closes #3926
2022-01-11 09:45:12 +01:00
Mark Paluch
1e40448b70 Polishing.
Tweak Javadoc.

See #3898
Original pull request: #3904.
2021-12-14 09:36:56 +01:00
Christoph Strobl
530912d07f Fix field inclusion in aggregation project operation.
Closes #3898
Original pull request: #3904.
2021-12-14 09:36:55 +01:00
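The affected operation as a minimal sketch: a $project stage including specific fields via the fluent aggregation API.

    import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
    import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

    import org.bson.Document;
    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.aggregation.AggregationResults;

    class ProjectionExample {

        static AggregationResults<Document> names(MongoTemplate template) {
            // Includes only firstName and lastName (plus _id) in the result documents.
            return template.aggregate(
                    newAggregation(project("firstName", "lastName")),
                    "people",
                    Document.class);
        }
    }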
Mark Paluch
82331451ea Propagate Bean ClassLoader to MongoTypeMapper.
We now set the ClassLoader from the ApplicationContext on the type mapper to ensure the type mapper has access to entities. Previously, `SimpleTypeInformationMapper` used the contextual ClassLoader, which failed in Fork/Join pool threads (such as parallel streams) because ForkJoinPool uses the system ClassLoader. Running e.g. a packaged Boot application sets up an application ClassLoader that has access to packaged code while the system ClassLoader does not.

Also, consistently access the MongoTypeMapper through its getter.

Closes #3905
2021-12-09 11:34:11 +01:00
Jens Schauder
6899567c01 Update build trigger to use branch build.
See #3865
2021-11-12 14:43:55 +01:00
Jens Schauder
0d869b3c23 After release cleanups.
See #3865
2021-11-12 11:00:06 +01:00
Jens Schauder
d8ef0db1a9 Prepare next development iteration.
See #3865
2021-11-12 11:00:03 +01:00
459 changed files with 11949 additions and 7528 deletions

View File

@@ -1,2 +1,2 @@
#Mon Oct 11 14:30:24 CEST 2021
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.3/apache-maven-3.8.3-bin.zip
#Fri Jun 03 09:42:19 CEST 2022
distributionUrl=https\://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.8.5/apache-maven-3.8.5-bin.zip

Jenkinsfile (vendored): 184 changed lines
View File

@@ -9,7 +9,7 @@ pipeline {
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/3.0.x", threshold: hudson.model.Result.SUCCESS)
upstream(upstreamProjects: "spring-data-commons/2.6.x", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -20,7 +20,64 @@ pipeline {
stages {
stage("Docker images") {
parallel {
stage('Publish JDK (Java 17) + MongoDB 4.4') {
stage('Publish JDK (main) + MongoDB 4.0') {
when {
anyOf {
changeset "ci/openjdk8-mongodb-4.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.0.version']} ci/openjdk8-mongodb-4.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (main) + MongoDB 4.4') {
when {
anyOf {
changeset "ci/openjdk8-mongodb-4.4/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk8-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (main) + MongoDB 5.0') {
when {
anyOf {
changeset "ci/openjdk8-mongodb-5.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.5.0.version']} ci/openjdk8-mongodb-5.0/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (LTS) + MongoDB 4.4') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-4.4/**"
@@ -32,26 +89,7 @@ pipeline {
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
}
}
}
stage('Publish JDK (Java 17) + MongoDB 5.0') {
when {
anyOf {
changeset "ci/openjdk17-mongodb-5.0/**"
changeset "ci/pipeline.properties"
}
}
agent { label 'data' }
options { timeout(time: 30, unit: 'MINUTES') }
steps {
script {
def image = docker.build("springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}", "--build-arg BASE=${p['docker.java.main.image']} --build-arg MONGODB=${p['docker.mongodb.5.0.version']} ci/openjdk17-mongodb-5.0/")
def image = docker.build("springci/spring-data-with-mongodb-4.4:${p['java.lts.tag']}", "--build-arg BASE=${p['docker.java.lts.image']} --build-arg MONGODB=${p['docker.mongodb.4.4.version']} ci/openjdk17-mongodb-4.4/")
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
image.push()
}
@@ -61,7 +99,7 @@ pipeline {
}
}
stage("test: baseline (Java 17)") {
stage("test: baseline (main)") {
when {
beforeAgent(true)
anyOf {
@@ -78,15 +116,13 @@ pipeline {
}
steps {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -101,8 +137,7 @@ pipeline {
}
}
parallel {
stage("test: mongodb 5.0 (Java 17)") {
stage("test: mongodb 4.4 (main)") {
agent {
label 'data'
}
@@ -112,15 +147,57 @@ pipeline {
}
steps {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
stage("test: mongodb 5.0 (main)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-5.0:${p['java.main.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
}
stage("test: baseline (LTS)") {
agent {
label 'data'
}
options { timeout(time: 30, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials("${p['artifactory.credentials']}")
}
steps {
script {
docker.image("harbor-repo.vmware.com/dockerhub-proxy-cache/springci/spring-data-with-mongodb-4.4:${p['java.lts.tag']}").inside(p['docker.java.inside.basic']) {
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --setParameter transactionLifetimeLimitSeconds=90 --setParameter maxTransactionLockRequestTimeoutMillis=10000 --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml clean dependency:list test -Duser.name=jenkins -Dsort -U -B'
}
}
}
@@ -147,18 +224,15 @@ pipeline {
steps {
script {
docker.withRegistry(p['docker.registry'], p['docker.credentials']) {
docker.image(p['docker.java.main.image']).inside(p['docker.java.inside.basic']) {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -v'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-mongodb " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
docker.image(p['docker.java.main.image']).inside(p['docker.java.inside.basic']) {
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -s settings.xml -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-mongodb " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -U -B'
}
}
}

View File

@@ -1,8 +1,8 @@
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start]
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://spring.io/projects/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://spring.io/projects/spring-data-mongodb#quick-start]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmain&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
The primary goal of the https://projects.spring.io/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The primary goal of the https://spring.io/projects/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities.
The Spring Data MongoDB project provides integration with the MongoDB document database.
@@ -115,7 +115,7 @@ Use `<mongo:client-settings cluster-hosts="..." />` instead
| `<mongo:db-factory writeConcern="..." />`
| NONE, NORMAL, SAFE, FSYNC_SAFE, REPLICAS_SAFE, MAJORITY
| W1, W2, W3, UNAKNOWLEDGED, AKNOWLEDGED, JOURNALED, MAJORITY
| W1, W2, W3, UNACKNOWLEDGED, ACKNOWLEDGED, JOURNALED, MAJORITY
|===
.Removed XML Namespace Elements and Attributes:
@@ -271,7 +271,7 @@ The https://spring.io/[spring.io] site contains several guides that show how to
[[building-from-source]]
== Building from Source
You do not need to build from source to use Spring Data. Binaries are available in https://repo.spring.io[repo.spring.io]
You do not need to build from source to use Spring Data. Binaries are available in https://repo.spring.io[repo.spring.io].
and accessible from Maven using the Maven configuration noted <<maven-configuration,above>>.
NOTE: Configuration for Gradle is similar to Maven.
@@ -281,17 +281,16 @@ Follow this https://start.spring.io/#type=maven-project&language=java&platformVe
to build an imperative application and this https://start.spring.io/#type=maven-project&language=java&platformVersion=2.5.4&packaging=jar&jvmVersion=1.8&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=data-mongodb-reactive[link]
to build a reactive one.
However, if you want to try out the latest and greatest, Spring Data MongoDB can be easily built with the https://github.com/takari/maven-wrapper[Maven wrapper]
and minimally, JDK 8 (https://www.oracle.com/java/technologies/downloads/[JDK downloads]).
However, if you want to try out the latest and greatest, Spring Data can be easily built with the https://github.com/takari/maven-wrapper[maven wrapper]
and minimally JDK 8 (https://www.oracle.com/java/technologies/downloads/[JDK downloads]).
In order to build Spring Data MongoDB, you will need to https://www.mongodb.com/try/download/community[download]
In order to build Spring Data MongoDB, first you will need to https://www.mongodb.com/try/download/community[download]
and https://docs.mongodb.com/manual/installation/[install a MongoDB distribution].
Once you have installed MongoDB, you need to start a MongoDB server. It is convenient to set an environment variable to
your MongoDB installation directory (e.g. `MONGODB_HOME`).
your MongoDB installation (e.g. `MONGODB_HOME`).
To run the full test suite, a https://docs.mongodb.com/manual/tutorial/deploy-replica-set/[MongoDB Replica Set]
is required.
To run the full test suite a https://docs.mongodb.com/manual/tutorial/deploy-replica-set/[MongoDB Replica Set] is required.
To run the MongoDB server enter the following command from a command-line:
@@ -332,7 +331,7 @@ In case you need to, you can adjust the `ulimit` with the following command (327
$ ulimit -n 32768
----
You can use `ulimit -a` again to verify the `ulimit` for "_open files_" was set appropriately.
You can use `ulimit -a` again to verify the `ulimit` on "_open files_" was set appropriately.
Now you are ready to build Spring Data MongoDB. Simply enter the following `mvnw` (Maven Wrapper) command:

View File

@@ -0,0 +1,22 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 && \
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -7,15 +7,16 @@ ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list; \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list; \
sed -i -e 's/http/https/g' /etc/apt/sources.list ; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 ; \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 ; \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list; \
echo ${TZ} > /etc/timezone;
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 && \
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update ; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
apt-get clean; \
rm -rf /var/lib/apt/lists/*;
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -7,17 +7,16 @@ ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list; \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list; \
sed -i -e 's/http/https/g' /etc/apt/sources.list ; \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget ; \
# MongoDB 5.0 release signing key
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv B00A0BD1E2C63C11 ; \
# Needed when MongoDB creates a 5.0 folder.
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/5.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-5.0.list; \
echo ${TZ} > /etc/timezone;
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4 && \
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update; \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} ; \
apt-get clean; \
rm -rf /var/lib/apt/lists/*;
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -0,0 +1,24 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 && \
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 656408E390CFB1F5 && \
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/4.4 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.4.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
ln -T /bin/true /usr/bin/systemctl && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
rm /usr/bin/systemctl && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -0,0 +1,24 @@
ARG BASE
FROM ${BASE}
# Any ARG statements before FROM are cleared.
ARG MONGODB
ENV TZ=Etc/UTC
ENV DEBIAN_FRONTEND=noninteractive
RUN set -eux; \
sed -i -e 's/archive.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/security.ubuntu.com/mirror.one.com/g' /etc/apt/sources.list && \
sed -i -e 's/ports.ubuntu.com/mirrors.ocf.berkeley.edu/g' /etc/apt/sources.list && \
sed -i -e 's/http/https/g' /etc/apt/sources.list && \
apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2 wget && \
# MongoDB 5.0 release signing key
apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv B00A0BD1E2C63C11 && \
# Needed when MongoDB creates a 5.0 folder.
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/5.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-5.0.list && \
echo ${TZ} > /etc/timezone
RUN apt-get update && \
apt-get install -y mongodb-org=${MONGODB} mongodb-org-server=${MONGODB} mongodb-org-shell=${MONGODB} mongodb-org-mongos=${MONGODB} mongodb-org-tools=${MONGODB} && \
apt-get clean && \
rm -rf /var/lib/apt/lists/*

View File

@@ -1,10 +1,15 @@
# Java versions
java.main.tag=17.0.2_8-jdk
java.main.tag=8u345-b01-jdk-focal
java.next.tag=11.0.16.1_1-jdk-focal
java.lts.tag=17.0.4.1_1-jdk-focal
# Docker container images - standard
docker.java.main.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.main.tag}
docker.java.next.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.next.tag}
docker.java.lts.image=harbor-repo.vmware.com/dockerhub-proxy-cache/library/eclipse-temurin:${java.lts.tag}
# Supported versions of MongoDB
docker.mongodb.4.0.version=4.0.28
docker.mongodb.4.4.version=4.4.12
docker.mongodb.5.0.version=5.0.6

pom.xml: 26 changed lines
View File

@@ -5,17 +5,17 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0-M2</version>
<version>3.3.11-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>https://projects.spring.io/spring-data-mongodb</url>
<url>https://spring.io/projects/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>3.0.0-M2</version>
<version>2.6.11-SNAPSHOT</version>
</parent>
<modules>
@@ -24,11 +24,10 @@
</modules>
<properties>
<source.level>16</source.level>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>3.0.0-M2</springdata.commons>
<mongo>4.5.0</mongo>
<springdata.commons>2.6.11-SNAPSHOT</springdata.commons>
<mongo>4.4.2</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -113,6 +112,17 @@
</developer>
</developers>
<scm>
<connection>scm:git:https://github.com/spring-projects/spring-data-mongodb.git</connection>
<developerConnection>scm:git:git@github.com:spring-projects/spring-data-mongodb.git</developerConnection>
<url>https://github.com/spring-projects/spring-data-mongodb</url>
</scm>
<issueManagement>
<system>GitHub</system>
<url>https://github.com/spring-projects/spring-data-mongodb/issues</url>
</issueManagement>
<profiles>
<profile>
<id>benchmarks</id>
@@ -135,8 +145,8 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
<repository>
<id>sonatype-libs-snapshot</id>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0-M2</version>
<version>3.3.11-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0-M2</version>
<version>3.3.11-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>4.0.0-M2</version>
<version>3.3.11-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -122,6 +122,27 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava2}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava3</groupId>
<artifactId>rxjava</artifactId>
@@ -131,6 +152,12 @@
<!-- CDI -->
<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-jcdi_2.0_spec</artifactId>
<version>1.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.interceptor</groupId>
@@ -140,48 +167,31 @@
</dependency>
<dependency>
<groupId>jakarta.enterprise</groupId>
<artifactId>jakarta.enterprise.cdi-api</artifactId>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
<version>${cdi}</version>
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>jakarta.annotation</groupId>
<artifactId>jakarta.annotation-api</artifactId>
<version>${jakarta-annotation-api}</version>
<groupId>javax.annotation</groupId>
<artifactId>javax.annotation-api</artifactId>
<version>${javax-annotation-api}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-se</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-spi</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-impl</artifactId>
<classifier>jakarta</classifier>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<!-- JSR 303 Validation -->
<dependency>
<groupId>jakarta.validation</groupId>
<artifactId>jakarta.validation-api</artifactId>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>${validation}</version>
<optional>true</optional>
</dependency>
@@ -196,23 +206,28 @@
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>7.0.1.Final</version>
<version>5.4.3.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>jakarta.el</groupId>
<artifactId>jakarta.el-api</artifactId>
<version>4.0.0</version>
<scope>provided</scope>
<optional>true</optional>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
<version>3.0.1-b11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>jakarta.el</artifactId>
<version>4.0.2</version>
<scope>provided</scope>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional>
</dependency>
@@ -222,6 +237,13 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
@@ -257,9 +279,9 @@
</dependency>
<dependency>
<groupId>jakarta.transaction</groupId>
<artifactId>jakarta.transaction-api</artifactId>
<version>2.0.0</version>
<groupId>javax.transaction</groupId>
<artifactId>jta</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>

View File

@@ -103,11 +103,19 @@ public class BindableMongoExpression implements MongoExpression {
return new BindableMongoExpression(expressionString, codecRegistryProvider, args);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoExpression#toDocument()
*/
@Override
public Document toDocument() {
return target.get();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "BindableMongoExpression{" + "expressionString='" + expressionString + '\'' + ", args="

View File

@@ -193,11 +193,19 @@ public class MongoDatabaseUtils {
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected void processResourceAfterCommit(MongoResourceHolder resourceHolder) {
@@ -206,6 +214,10 @@ public class MongoDatabaseUtils {
}
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#afterCompletion(int)
*/
@Override
public void afterCompletion(int status) {
@@ -216,6 +228,10 @@ public class MongoDatabaseUtils {
super.afterCompletion(status);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected void releaseResource(MongoResourceHolder resourceHolder, Object resourceKey) {

View File

@@ -0,0 +1,57 @@
/*
* Copyright 2011-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessException;
import com.mongodb.client.MongoDatabase;
/**
* Interface for factories creating {@link MongoDatabase} instances.
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 3.0, use {@link MongoDatabaseFactory} instead.
*/
@Deprecated
public interface MongoDbFactory extends MongoDatabaseFactory {
/**
* Creates a default {@link MongoDatabase} instance.
*
* @return never {@literal null}.
* @throws DataAccessException
* @deprecated since 3.0. Use {@link #getMongoDatabase()} instead.
*/
@Deprecated
default MongoDatabase getDb() throws DataAccessException {
return getMongoDatabase();
}
/**
* Obtain a {@link MongoDatabase} instance to access the database with the given name.
*
* @param dbName must not be {@literal null} or empty.
* @return never {@literal null}.
* @throws DataAccessException
* @deprecated since 3.0. Use {@link #getMongoDatabase(String)} instead.
*/
@Deprecated
default MongoDatabase getDb(String dbName) throws DataAccessException {
return getMongoDatabase(dbName);
}
}

View File

@@ -106,6 +106,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
this.options = options;
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doGetTransaction()
*/
@Override
protected Object doGetTransaction() throws TransactionException {
@@ -114,11 +118,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return new MongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doBegin(java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected void doBegin(Object transaction, TransactionDefinition definition) throws TransactionException {
@@ -148,6 +160,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSuspend(java.lang.Object)
*/
@Override
protected Object doSuspend(Object transaction) throws TransactionException {
@@ -157,11 +173,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doResume(java.lang.Object, java.lang.Object)
*/
@Override
protected void doResume(@Nullable Object transaction, Object suspendedResources) {
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), suspendedResources);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCommit(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected final void doCommit(DefaultTransactionStatus status) throws TransactionException {
@@ -212,6 +236,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doRollback(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doRollback(DefaultTransactionStatus status) throws TransactionException {
@@ -231,6 +259,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
}
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSetRollbackOnly(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doSetRollbackOnly(DefaultTransactionStatus status) throws TransactionException {
@@ -238,6 +270,10 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
transactionObject.getRequiredResourceHolder().setRollbackOnly();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCleanupAfterCompletion(java.lang.Object)
*/
@Override
protected void doCleanupAfterCompletion(Object transaction) {
@@ -289,11 +325,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return dbFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceTransactionManager#getResourceFactory()
*/
@Override
public MongoDatabaseFactory getResourceFactory() {
return getRequiredDbFactory();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDbFactory();
@@ -461,11 +505,19 @@ public class MongoTransactionManager extends AbstractPlatformTransactionManager
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
TransactionSynchronizationUtils.triggerFlush();

View File

@@ -214,11 +214,19 @@ public class ReactiveMongoDatabaseUtils {
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected Mono<Void> processResourceAfterCommit(ReactiveMongoResourceHolder resourceHolder) {
@@ -229,6 +237,10 @@ public class ReactiveMongoDatabaseUtils {
return Mono.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#afterCompletion(int)
*/
@Override
public Mono<Void> afterCompletion(int status) {
@@ -244,6 +256,10 @@ public class ReactiveMongoDatabaseUtils {
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.ReactiveResourceSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> releaseResource(ReactiveMongoResourceHolder resourceHolder, Object resourceKey) {

View File

@@ -110,6 +110,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doGetTransaction(org.springframework.transaction.reactive.TransactionSynchronizationManager)
*/
@Override
protected Object doGetTransaction(TransactionSynchronizationManager synchronizationManager)
throws TransactionException {
@@ -119,11 +123,19 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return new ReactiveMongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doBegin(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected Mono<Void> doBegin(TransactionSynchronizationManager synchronizationManager, Object transaction,
TransactionDefinition definition) throws TransactionException {
@@ -163,6 +175,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSuspend(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Object> doSuspend(TransactionSynchronizationManager synchronizationManager, Object transaction)
throws TransactionException {
@@ -176,6 +192,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doResume(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object, java.lang.Object)
*/
@Override
protected Mono<Void> doResume(TransactionSynchronizationManager synchronizationManager, @Nullable Object transaction,
Object suspendedResources) {
@@ -183,6 +203,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
.fromRunnable(() -> synchronizationManager.bindResource(getRequiredDatabaseFactory(), suspendedResources));
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCommit(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected final Mono<Void> doCommit(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -219,6 +243,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return transactionObject.commitTransaction();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doRollback(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doRollback(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) {
@@ -240,6 +268,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doSetRollbackOnly(org.springframework.transaction.reactive.TransactionSynchronizationManager, org.springframework.transaction.reactive.GenericReactiveTransaction)
*/
@Override
protected Mono<Void> doSetRollbackOnly(TransactionSynchronizationManager synchronizationManager,
GenericReactiveTransaction status) throws TransactionException {
@@ -250,6 +282,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
});
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.reactive.AbstractReactiveTransactionManager#doCleanupAfterCompletion(org.springframework.transaction.reactive.TransactionSynchronizationManager, java.lang.Object)
*/
@Override
protected Mono<Void> doCleanupAfterCompletion(TransactionSynchronizationManager synchronizationManager,
Object transaction) {
@@ -304,6 +340,10 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return databaseFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDatabaseFactory();
@@ -469,11 +509,19 @@ public class ReactiveMongoTransactionManager extends AbstractReactiveTransaction
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
throw new UnsupportedOperationException("flush() not supported");

View File

@@ -95,6 +95,10 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
this.sessionType = sessionType;
}
/*
* (non-Javadoc)
* @see org.aopalliance.intercept.MethodInterceptor#invoke(org.aopalliance.intercept.MethodInvocation)
*/
@Nullable
@Override
public Object invoke(MethodInvocation methodInvocation) throws Throwable {

View File

@@ -15,8 +15,8 @@
*/
package org.springframework.data.mongodb;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.util.Version;
import org.springframework.util.StringUtils;
@@ -31,7 +31,7 @@ import com.mongodb.MongoDriverInformation;
*/
public class SpringDataMongoDB {
private static final Log LOGGER = LogFactory.getLog(SpringDataMongoDB.class);
private static final Logger LOGGER = LoggerFactory.getLogger(SpringDataMongoDB.class);
private static final Version FALLBACK_VERSION = new Version(3);
private static final MongoDriverInformation DRIVER_INFORMATION = MongoDriverInformation
@@ -68,7 +68,7 @@ public class SpringDataMongoDB {
try {
return Version.parse(versionString);
} catch (Exception e) {
LOGGER.debug(String.format("Cannot read Spring Data MongoDB version '%s'.", versionString));
LOGGER.debug("Cannot read Spring Data MongoDB version '{}'.", versionString);
}
return FALLBACK_VERSION;

View File

@@ -25,7 +25,9 @@ import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.lang.Nullable;
import com.mongodb.MongoClientSettings;
import com.mongodb.MongoClientSettings.Builder;
@@ -78,6 +80,24 @@ public abstract class AbstractMongoClientConfiguration extends MongoConfiguratio
return new SimpleMongoClientDatabaseFactory(mongoClient(), getDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class (the concrete class, not this one) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoClientConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
@Nullable
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
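To illustrate the derivation described above, a minimal sketch; the class, package, and database name are hypothetical and only show how the base package falls out of the concrete configuration class:

package com.acme;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

// Hypothetical configuration class: because AppConfig lives in com.acme,
// mapped @Document classes are scanned in com.acme by default.
@Configuration
public class AppConfig extends AbstractMongoClientConfiguration {

	@Override
	protected String getDatabaseName() {
		return "example-db"; // assumed database name for this sketch
	}

	// To scan a different package, override getMappingBasePackages() (the non-deprecated hook),
	// e.g. returning Collections.singleton("com.acme.domain").
}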
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext(MongoCustomConversions)}. Will get {@link #customConversions()} applied.

View File

@@ -30,6 +30,10 @@ import com.mongodb.ConnectionString;
*/
public class ConnectionStringPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String connectionString) {

View File

@@ -34,6 +34,10 @@ import org.w3c.dom.Element;
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -42,6 +46,10 @@ class GridFsTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.GRID_FS_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {

View File

@@ -24,7 +24,6 @@ import java.util.List;
import java.util.Set;
import org.springframework.beans.BeanMetadataElement;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
@@ -64,6 +63,7 @@ import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
@@ -80,7 +80,7 @@ import org.w3c.dom.Element;
public class MappingMongoConverterParser implements BeanDefinitionParser {
private static final String BASE_PACKAGE = "base-package";
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("jakarta.validation.Validator",
private static final boolean JSR_303_PRESENT = ClassUtils.isPresent("javax.validation.Validator",
MappingMongoConverterParser.class.getClassLoader());
/* (non-Javadoc)
@@ -135,9 +135,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
new BeanComponentDefinition(indexOperationsProviderBuilder.getBeanDefinition(), "indexOperationsProvider"));
}
try {
registry.getBeanDefinition(INDEX_HELPER_BEAN_NAME);
} catch (NoSuchBeanDefinitionException ignored) {
if (!registry.containsBeanDefinition(INDEX_HELPER_BEAN_NAME)) {
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
@@ -151,7 +149,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
BeanDefinition validatingMongoEventListener = potentiallyCreateValidatingMongoEventListener(element, parserContext);
if (validatingMongoEventListener != null) {
if (validatingMongoEventListener != null && !registry.containsBeanDefinition(VALIDATING_EVENT_LISTENER_BEAN_NAME)) {
parserContext.registerBeanComponent(
new BeanComponentDefinition(validatingMongoEventListener, VALIDATING_EVENT_LISTENER_BEAN_NAME));
}
@@ -165,15 +163,16 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
private BeanDefinition potentiallyCreateValidatingMongoEventListener(Element element, ParserContext parserContext) {
String disableValidation = element.getAttribute("disable-validation");
boolean validationDisabled = StringUtils.hasText(disableValidation) && Boolean.valueOf(disableValidation);
boolean validationDisabled = StringUtils.hasText(disableValidation) && Boolean.parseBoolean(disableValidation);
if (!validationDisabled) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition();
RuntimeBeanReference validator = getValidator(builder, parserContext);
RuntimeBeanReference validator = getValidator(element, parserContext);
if (validator != null) {
builder.getRawBeanDefinition().setBeanClass(ValidatingMongoEventListener.class);
builder.getRawBeanDefinition().setSource(element);
builder.addConstructorArgValue(validator);
return builder.getBeanDefinition();
@@ -195,7 +194,6 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
validatorDef.setSource(source);
validatorDef.setRole(BeanDefinition.ROLE_INFRASTRUCTURE);
String validatorName = parserContext.getReaderContext().registerWithGeneratedName(validatorDef);
parserContext.registerBeanComponent(new BeanComponentDefinition(validatorDef, validatorName));
return new RuntimeBeanReference(validatorName);
}
@@ -376,6 +374,10 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
this.delegates = new HashSet<>(Arrays.asList(filters));
}
/*
* (non-Javadoc)
* @see org.springframework.core.type.filter.TypeFilter#match(org.springframework.core.type.classreading.MetadataReader, org.springframework.core.type.classreading.MetadataReaderFactory)
*/
public boolean match(MetadataReader metadataReader, MetadataReaderFactory metadataReaderFactory)
throws IOException {

View File

@@ -47,16 +47,28 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@Override
protected Class<?> getBeanClass(Element element) {
return AuditingEntityCallback.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@Override
protected boolean shouldGenerateId() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/
@Override
protected void doParse(Element element, ParserContext parserContext, BeanDefinitionBuilder builder) {

View File

@@ -18,7 +18,6 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -28,8 +27,6 @@ import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEntityCallback;
import org.springframework.util.Assert;
@@ -42,16 +39,28 @@ import org.springframework.util.Assert;
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "mongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
public void registerBeanDefinitions(AnnotationMetadata annotationMetadata, BeanDefinitionRegistry registry) {
@@ -61,6 +70,10 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
super.registerBeanDefinitions(annotationMetadata, registry);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
@@ -68,13 +81,17 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(org.springframework.data.repository.config.PersistentEntitiesFactoryBean.class);
definition.addConstructorArgValue(new RuntimeBeanReference(MappingContext.class));
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {

View File

@@ -35,6 +35,10 @@ import org.w3c.dom.Element;
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);

View File

@@ -51,6 +51,10 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final String OPTIONS_DELIMITER = "?";
private static final String OPTION_VALUE_DELIMITER = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String text) throws IllegalArgumentException {

View File

@@ -62,6 +62,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -70,6 +74,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {

View File

@@ -26,6 +26,10 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.NamespaceHandler#init()
*/
public void init() {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());

View File

@@ -40,6 +40,7 @@ import org.w3c.dom.Element;
* @author Christoph Strobl
* @author Mark Paluch
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}

View File

@@ -39,6 +39,10 @@ import org.w3c.dom.Element;
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext)
throws BeanDefinitionStoreException {
@@ -47,6 +51,10 @@ class MongoTemplateParser extends AbstractBeanDefinitionParser {
return StringUtils.hasText(id) ? id : BeanNames.MONGO_TEMPLATE_BEAN_NAME;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {

View File

@@ -41,11 +41,19 @@ public class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEnti
this.converter = converter;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public PersistentEntities getObject() {
return PersistentEntities.of(converter.getMappingContext());
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return PersistentEntities.class;

View File

@@ -38,16 +38,28 @@ import org.springframework.util.Assert;
*/
class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableReactiveMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "reactiveMongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
@@ -62,6 +74,10 @@ class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupp
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {

View File

@@ -32,6 +32,10 @@ import com.mongodb.ReadConcernLevel;
*/
public class ReadConcernPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String readConcernString) {

View File

@@ -29,6 +29,10 @@ import com.mongodb.ReadPreference;
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String readPreferenceString) throws IllegalArgumentException {

View File

@@ -21,8 +21,8 @@ import java.net.UnknownHostException;
import java.util.HashSet;
import java.util.Set;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -43,9 +43,13 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
* A port is a number without a leading 0 at the end of the address that is preceded by just a single :.
*/
private static final String HOST_PORT_SPLIT_PATTERN = "(?<!:):(?=[123456789]\\d*$)";
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address %s '%s'. Check your replica set configuration!";
private static final Log LOG = LogFactory.getLog(ServerAddressPropertyEditor.class);
private static final String COULD_NOT_PARSE_ADDRESS_MESSAGE = "Could not parse address {} '{}'. Check your replica set configuration!";
private static final Logger LOG = LoggerFactory.getLogger(ServerAddressPropertyEditor.class);
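A small sketch (host values are made up, and the class is not part of the editor above) of how the split pattern described above behaves:

// Hypothetical demonstration of HOST_PORT_SPLIT_PATTERN.
public class SplitPatternDemo {

	public static void main(String[] args) {
		String pattern = "(?<!:):(?=[123456789]\\d*$)";

		// The single trailing colon separates host and port.
		System.out.println(java.util.Arrays.toString("localhost:27017".split(pattern))); // [localhost, 27017]

		// No colon qualifies as a port separator here, so the IPv6-style value stays whole.
		System.out.println(java.util.Arrays.toString("fe80::1".split(pattern))); // [fe80::1]
	}
}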
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String replicaSetString) {
@@ -84,18 +88,14 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
private ServerAddress parseServerAddress(String source) {
if (!StringUtils.hasText(source)) {
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source));
}
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
return null;
}
String[] hostAndPort = extractHostAddressAndPort(source.trim());
if (hostAndPort.length > 2) {
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source));
}
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "source", source);
return null;
}
@@ -105,13 +105,9 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
return port == null ? new ServerAddress(hostAddress) : new ServerAddress(hostAddress, port);
} catch (UnknownHostException e) {
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]));
}
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "host", hostAndPort[0]);
} catch (NumberFormatException e) {
if(LOG.isWarnEnabled()) {
LOG.warn(String.format(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]));
}
LOG.warn(COULD_NOT_PARSE_ADDRESS_MESSAGE, "port", hostAndPort[1]);
}
return null;

View File

@@ -26,6 +26,10 @@ import com.mongodb.WriteConcern;
*/
public class StringToWriteConcernConverter implements Converter<String, WriteConcern> {
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
public WriteConcern convert(String source) {
WriteConcern writeConcern = WriteConcern.valueOf(source);

View File

@@ -29,6 +29,10 @@ import org.springframework.util.StringUtils;
*/
public class UUidRepresentationPropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(@Nullable String value) {

View File

@@ -189,6 +189,10 @@ public class ChangeStreamEvent<T> {
String.format("No converter found capable of converting %s to %s", fullDocument.getClass(), targetType));
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';

View File

@@ -47,6 +47,20 @@ public class CollectionOptions {
private ValidationOptions validationOptions;
private @Nullable TimeSeriesOptions timeSeriesOptions;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated. Can be {@literal null}.
* @param maxDocuments the maximum number of documents in the collection. Can be {@literal null}.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise. Can be {@literal null}.
* @deprecated since 2.0, please use {@link CollectionOptions#empty()} as the entry point.
*/
@Deprecated
public CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped) {
this(size, maxDocuments, capped, null, ValidationOptions.none(), null);
}
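For comparison, a rough sketch of the fluent entry point referenced in the deprecation note; the size and document limits are illustrative values only:

// Sketch: fluent replacement for the deprecated constructor, describing a capped
// collection of roughly 1 MB that keeps at most 1000 documents.
CollectionOptions options = CollectionOptions.empty()
		.capped()
		.size(1024 * 1024)
		.maxDocuments(1000);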
private CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped,
@Nullable Collation collation, ValidationOptions validationOptions,
@Nullable TimeSeriesOptions timeSeriesOptions) {

View File

@@ -176,16 +176,20 @@ class CountQuery {
Document $geoWithinMin = new Document("$geoWithin",
new Document(spheric ? "$centerSphere" : "$center", $centerMin));
List<Document> criteria = new ArrayList<>();
List<Document> criteria;
if ($and != null) {
if ($and instanceof Collection) {
criteria.addAll((Collection) $and);
Collection andElements = (Collection) $and;
criteria = new ArrayList<>(andElements.size() + 2);
criteria.addAll(andElements);
} else {
throw new IllegalArgumentException(
"Cannot rewrite query as it contains an '$and' element that is not a Collection!: Offending element: "
+ $and);
}
} else {
criteria = new ArrayList<>(2);
}
criteria.add(new Document("$nor", Collections.singletonList(new Document(key, $geoWithinMin))));

View File

@@ -109,6 +109,10 @@ class DefaultBulkOperations implements BulkOperations {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
@@ -121,6 +125,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
@@ -131,6 +139,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
@@ -141,6 +153,10 @@ class DefaultBulkOperations implements BulkOperations {
return updateOne(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
@@ -153,6 +169,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
@@ -163,6 +183,10 @@ class DefaultBulkOperations implements BulkOperations {
return updateMulti(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
@@ -175,11 +199,19 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
@@ -190,6 +222,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
@@ -203,6 +239,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
@@ -215,6 +255,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#replaceOne(org.springframework.data.mongodb.core.query.Query, java.lang.Object, org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public BulkOperations replaceOne(Query query, Object replacement, FindAndReplaceOptions options) {
@@ -234,6 +278,10 @@ class DefaultBulkOperations implements BulkOperations {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#execute()
*/
@Override
public com.mongodb.bulk.BulkWriteResult execute() {

View File

@@ -112,6 +112,10 @@ public class DefaultIndexOperations implements IndexOperations {
this.type = type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public String ensureIndex(final IndexDefinition indexDefinition) {
return execute(collection -> {
@@ -146,6 +150,10 @@ public class DefaultIndexOperations implements IndexOperations {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropIndex(java.lang.String)
*/
public void dropIndex(final String name) {
execute(collection -> {
@@ -155,10 +163,18 @@ public class DefaultIndexOperations implements IndexOperations {
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#dropAllIndexes()
*/
public void dropAllIndexes() {
dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperations#getIndexInfo()
*/
public List<IndexInfo> getIndexInfo() {
return execute(new CollectionCallback<List<IndexInfo>>() {
@@ -172,7 +188,8 @@ public class DefaultIndexOperations implements IndexOperations {
private List<IndexInfo> getIndexData(MongoCursor<Document> cursor) {
List<IndexInfo> indexInfoList = new ArrayList<>();
int available = cursor.available();
List<IndexInfo> indexInfoList = available > 0 ? new ArrayList<>(available) : new ArrayList<>();
while (cursor.hasNext()) {

View File

@@ -42,6 +42,10 @@ class DefaultIndexOperationsProvider implements IndexOperationsProvider {
this.mapper = mapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexOperationsProvider#indexOps(java.lang.String, java.lang.Class)
*/
@Override
public IndexOperations indexOps(String collectionName, Class<?> type) {
return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper, type);

View File

@@ -86,6 +86,10 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
this.type = type;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public Mono<String> ensureIndex(final IndexDefinition indexDefinition) {
return mongoOperations.execute(collectionName, collection -> {
@@ -115,14 +119,26 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
.orElse(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropIndex(java.lang.String)
*/
public Mono<Void> dropIndex(final String name) {
return mongoOperations.execute(collectionName, collection -> collection.dropIndex(name)).then();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#dropAllIndexes()
*/
public Mono<Void> dropAllIndexes() {
return dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#getIndexInfo()
*/
public Flux<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, collection -> collection.listIndexes(Document.class)) //

View File

@@ -70,11 +70,19 @@ class DefaultScriptOperations implements ScriptOperations {
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
@@ -84,6 +92,10 @@ class DefaultScriptOperations implements ScriptOperations {
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
@@ -103,6 +115,10 @@ class DefaultScriptOperations implements ScriptOperations {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
@@ -119,6 +135,10 @@ class DefaultScriptOperations implements ScriptOperations {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
@@ -127,6 +147,10 @@ class DefaultScriptOperations implements ScriptOperations {
return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {

View File

@@ -21,10 +21,8 @@ import java.util.Map;
import java.util.Optional;
import org.bson.Document;
import org.springframework.core.convert.ConversionService;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mapping.IdentifierAccessor;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentEntity;
@@ -32,10 +30,7 @@ import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.core.CollectionOptions.TimeSeriesOptions;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
@@ -44,11 +39,6 @@ import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.timeseries.Granularity;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.projection.EntityProjectionIntrospector;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -57,10 +47,6 @@ import org.springframework.util.MultiValueMap;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.TimeSeriesGranularity;
import com.mongodb.client.model.ValidationOptions;
/**
* Common operations performed on an entity in the context of its mapping metadata.
*
@@ -76,31 +62,9 @@ class EntityOperations {
private static final String ID_FIELD = "_id";
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context;
private final QueryMapper queryMapper;
private final EntityProjectionIntrospector introspector;
private final MongoJsonSchemaMapper schemaMapper;
EntityOperations(MongoConverter converter) {
this(converter, new QueryMapper(converter));
}
EntityOperations(MongoConverter converter, QueryMapper queryMapper) {
this(converter, converter.getMappingContext(), converter.getCustomConversions(), converter.getProjectionFactory(),
queryMapper);
}
EntityOperations(MongoConverter converter,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context,
CustomConversions conversions, ProjectionFactory projectionFactory, QueryMapper queryMapper) {
EntityOperations(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
this.context = context;
this.queryMapper = queryMapper;
this.introspector = EntityProjectionIntrospector.create(projectionFactory,
EntityProjectionIntrospector.ProjectionPredicate.typeHierarchy()
.and(((target, underlyingType) -> !conversions.isSimpleType(target))),
context);
this.schemaMapper = new MongoJsonSchemaMapper(converter);
}
/**
@@ -174,7 +138,14 @@ class EntityOperations {
"No class parameter provided, entity collection can't be determined!");
}
return context.getRequiredPersistentEntity(entityClass).getCollection();
MongoPersistentEntity<?> persistentEntity = context.getPersistentEntity(entityClass);
if (persistentEntity == null) {
throw new MappingException(String.format(
"Cannot determine collection name from type '%s'. Is it a store native type?", entityClass.getName()));
}
return persistentEntity.getCollection();
}
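Since a store-native type such as org.bson.Document carries no mapping metadata, the collection name has to be supplied explicitly to avoid the MappingException above; a hedged sketch (the "people" collection and field values are assumed):

// Sketch only: "people" and the field values are made-up examples.
void saveRawDocument(org.springframework.data.mongodb.core.MongoOperations operations) {
	org.bson.Document raw = new org.bson.Document("firstname", "Ada");
	operations.save(raw, "people"); // explicit collection name, no metadata lookup needed
	// operations.save(raw);        // would fail with the MappingException shown above
}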
public Query getByIdInQuery(Collection<?> entities) {
@@ -265,89 +236,6 @@ class EntityOperations {
return UntypedOperations.instance();
}
/**
* Introspect the given {@link Class result type} in the context of the {@link Class entity type} to determine whether
* the returned type is a projection and which property paths participate in the projection.
*
* @param resultType the type to project on. Must not be {@literal null}.
* @param entityType the source domain type. Must not be {@literal null}.
* @return the introspection result.
* @since 3.4
* @see EntityProjectionIntrospector#introspect(Class, Class)
*/
public <M, D> EntityProjection<M, D> introspectProjection(Class<M> resultType, Class<D> entityType) {
return introspector.introspect(resultType, entityType);
}
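Conceptually (the domain and projection types below are hypothetical), a closed interface projection exposes only a subset of the entity's properties, and that subset is what the introspection reports as participating property paths:

// Hypothetical types for illustration only.
class Person {
	String firstname;
	String lastname;
	int age;
}

// Closed projection: only firstname and lastname participate.
interface PersonSummary {
	String getFirstname();
	String getLastname();
}

// introspectProjection(PersonSummary.class, Person.class) would describe a projection whose
// participating property paths are firstname and lastname, whereas
// introspectProjection(Person.class, Person.class) describes a plain, non-projecting read.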
/**
* Convert {@link CollectionOptions} to {@link CreateCollectionOptions} using {@link Class entityType} to obtain
* mapping metadata.
*
* @param collectionOptions
* @param entityType
* @return
* @since 3.4
*/
public CreateCollectionOptions convertToCreateCollectionOptions(@Nullable CollectionOptions collectionOptions,
Class<?> entityType) {
Optional<Collation> collation = Optionals.firstNonEmpty(
() -> Optional.ofNullable(collectionOptions).flatMap(CollectionOptions::getCollation),
() -> forType(entityType).getCollation());//
CreateCollectionOptions result = new CreateCollectionOptions();
collation.map(Collation::toMongoCollation).ifPresent(result::collation);
if (collectionOptions == null) {
return result;
}
collectionOptions.getCapped().ifPresent(result::capped);
collectionOptions.getSize().ifPresent(result::sizeInBytes);
collectionOptions.getMaxDocuments().ifPresent(result::maxDocuments);
collectionOptions.getCollation().map(Collation::toMongoCollation).ifPresent(result::collation);
collectionOptions.getValidationOptions().ifPresent(it -> {
ValidationOptions validationOptions = new ValidationOptions();
it.getValidationAction().ifPresent(validationOptions::validationAction);
it.getValidationLevel().ifPresent(validationOptions::validationLevel);
it.getValidator().ifPresent(val -> validationOptions.validator(getMappedValidator(val, entityType)));
result.validationOptions(validationOptions);
});
collectionOptions.getTimeSeriesOptions().map(forType(entityType)::mapTimeSeriesOptions).ifPresent(it -> {
com.mongodb.client.model.TimeSeriesOptions options = new com.mongodb.client.model.TimeSeriesOptions(
it.getTimeField());
if (StringUtils.hasText(it.getMetaField())) {
options.metaField(it.getMetaField());
}
if (!Granularity.DEFAULT.equals(it.getGranularity())) {
options.granularity(TimeSeriesGranularity.valueOf(it.getGranularity().name().toUpperCase()));
}
result.timeSeriesOptions(options);
});
return result;
}
private Document getMappedValidator(Validator validator, Class<?> domainType) {
Document validationRules = validator.toDocument();
if (validationRules.containsKey("$jsonSchema")) {
return schemaMapper.mapSchema(validationRules, domainType);
}
return queryMapper.getMappedObject(validationRules, context.getPersistentEntity(domainType));
}
/**
* A representation of information about an entity.
*
@@ -492,21 +380,37 @@ class EntityOperations {
this.map = map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override
public String getIdFieldName() {
return ID_FIELD;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override
public Object getId() {
return map.get(ID_FIELD);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override
public Query getByIdQuery() {
return Query.query(Criteria.where(ID_FIELD).is(map.get(ID_FIELD)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#populateIdIfNecessary(java.lang.Object)
*/
@Nullable
@Override
public T populateIdIfNecessary(@Nullable Object id) {
@@ -516,11 +420,19 @@ class EntityOperations {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion()
*/
@Override
public Query getQueryForVersion() {
throw new MappingException("Cannot query for version on plain Documents!");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
return MappedDocument.of(map instanceof Document //
@@ -528,27 +440,47 @@ class EntityOperations {
: new Document(map));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#initializeVersionProperty()
*/
@Override
public T initializeVersionProperty() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#getVersion()
*/
@Override
@Nullable
public Number getVersion() {
return null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MutablePersistableSource#incrementVersion()
*/
@Override
public T incrementVersion() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override
public T getBean() {
return map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override
public boolean isNew() {
return map.get(ID_FIELD) != null;
@@ -561,6 +493,10 @@ class EntityOperations {
super(map);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
@SuppressWarnings("unchecked")
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
@@ -600,16 +536,28 @@ class EntityOperations {
return new MappedEntity<>(entity, identifierAccessor, propertyAccessor);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
*/
@Override
public String getIdFieldName() {
return entity.getRequiredIdProperty().getFieldName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getId()
*/
@Override
public Object getId() {
return idAccessor.getRequiredIdentifier();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getByIdQuery()
*/
@Override
public Query getByIdQuery() {
@@ -622,6 +570,10 @@ class EntityOperations {
return Query.query(Criteria.where(idProperty.getName()).is(getId()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getQueryForVersion(java.lang.Object)
*/
@Override
public Query getQueryForVersion() {
@@ -632,6 +584,10 @@ class EntityOperations {
.and(versionProperty.getName()).is(getVersion()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#toMappedDocument(org.springframework.data.mongodb.core.convert.MongoWriter)
*/
@Override
public MappedDocument toMappedDocument(MongoWriter<? super T> writer) {
@@ -647,6 +603,10 @@ class EntityOperations {
return MappedDocument.of(document);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#assertUpdateableIdIfNotSet()
*/
public void assertUpdateableIdIfNotSet() {
if (!entity.hasIdProperty()) {
@@ -667,22 +627,38 @@ class EntityOperations {
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#isVersionedEntity()
*/
@Override
public boolean isVersionedEntity() {
return entity.hasVersionProperty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getVersion()
*/
@Override
@Nullable
public Object getVersion() {
return propertyAccessor.getProperty(entity.getRequiredVersionProperty());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getBean()
*/
@Override
public T getBean() {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.Entity#isNew()
*/
@Override
public boolean isNew() {
return entity.isNew(propertyAccessor.getBean());
@@ -717,6 +693,10 @@ class EntityOperations {
new ConvertingPropertyAccessor<>(propertyAccessor, conversionService));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#populateIdIfNecessary(java.lang.Object)
*/
@Nullable
@Override
public T populateIdIfNecessary(@Nullable Object id) {
@@ -738,6 +718,10 @@ class EntityOperations {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.MappedEntity#getVersion()
*/
@Override
@Nullable
public Number getVersion() {
@@ -747,6 +731,10 @@ class EntityOperations {
return propertyAccessor.getProperty(versionProperty, Number.class);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#initializeVersionProperty()
*/
@Override
public T initializeVersionProperty() {
@@ -761,6 +749,10 @@ class EntityOperations {
return propertyAccessor.getBean();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.AdaptibleEntity#incrementVersion()
*/
@Override
public T incrementVersion() {
@@ -832,11 +824,19 @@ class EntityOperations {
return (TypedOperations) INSTANCE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override
public Optional<Collation> getCollation() {
return Optional.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Optional<Collation> getCollation(Query query) {
@@ -871,11 +871,19 @@ class EntityOperations {
this.entity = entity;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation()
*/
@Override
public Optional<Collation> getCollation() {
return Optional.ofNullable(entity.getCollation());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.TypedOperations#getCollation(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public Optional<Collation> getCollation(Query query) {

View File

@@ -15,10 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import java.util.stream.Stream;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.util.CloseableIterator;
/**
* {@link ExecutableAggregationOperation} allows creation and execution of MongoDB aggregation operations in a fluent
@@ -89,12 +88,12 @@ public interface ExecutableAggregationOperation {
/**
* Apply pipeline operations as specified and stream all matching elements. <br />
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable}
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link com.mongodb.client.FindIterable}
*
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @return a {@link CloseableIterator} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} and needs to be closed.
* Never {@literal null}.
*/
Stream<T> stream();
CloseableIterator<T> stream();
}
/**

View File

@@ -15,11 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import java.util.stream.Stream;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.util.CloseableIterator;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -38,6 +37,10 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation#aggregateAndReturn(java.lang.Class)
*/
@Override
public <T> ExecutableAggregation<T> aggregateAndReturn(Class<T> domainType) {
@@ -66,6 +69,10 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.AggregationWithCollection#inCollection(java.lang.String)
*/
@Override
public AggregationWithAggregation<T> inCollection(String collection) {
@@ -74,6 +81,10 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.AggregationWithAggregation#by(org.springframework.data.mongodb.core.aggregation.Aggregation)
*/
@Override
public TerminatingAggregation<T> by(Aggregation aggregation) {
@@ -82,13 +93,21 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
return new ExecutableAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.TerminatingAggregation#all()
*/
@Override
public AggregationResults<T> all() {
return template.aggregate(aggregation, getCollectionName(aggregation), domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation.TerminatingAggregation#stream()
*/
@Override
public Stream<T> stream() {
public CloseableIterator<T> stream() {
return template.aggregateStream(aggregation, getCollectionName(aggregation), domainType);
}

View File

@@ -118,8 +118,8 @@ public interface ExecutableFindOperation {
/**
* Stream all matching elements.
*
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @return a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} and needs to be closed. Never
* {@literal null}.
*/
Stream<T> stream();
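A minimal usage sketch (the Person type and the lastname criterion are assumed) showing the returned Stream being closed via try-with-resources:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.stream.Stream;
import org.springframework.data.mongodb.core.MongoTemplate;

// Sketch: the Stream wraps a live cursor, so close it once fully processed.
void streamByLastname(MongoTemplate template) {
	try (Stream<Person> people = template.query(Person.class)
			.matching(query(where("lastname").is("Matthews")))
			.stream()) {
		people.forEach(System.out::println);
	}
}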

View File

@@ -20,11 +20,12 @@ import java.util.Optional;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.util.CloseableIterator;
import org.springframework.data.util.StreamUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
@@ -50,6 +51,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation#query(java.lang.Class)
*/
@Override
public <T> ExecutableFind<T> query(Class<T> domainType) {
@@ -69,11 +74,11 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
private final MongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
private final @Nullable String collection;
@Nullable private final String collection;
private final Query query;
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType,
@Nullable String collection, Query query) {
String collection, Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
@@ -81,6 +86,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.query = query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithCollection#inCollection(java.lang.String)
*/
@Override
public FindWithProjection<T> inCollection(String collection) {
@@ -89,6 +98,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithProjection#as(Class)
*/
@Override
public <T1> FindWithQuery<T1> as(Class<T1> returnType) {
@@ -97,6 +110,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingFind<T> matching(Query query) {
@@ -105,6 +122,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new ExecutableFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#oneValue()
*/
@Override
public T oneValue() {
@@ -121,6 +142,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return result.iterator().next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#firstValue()
*/
@Override
public T firstValue() {
@@ -129,31 +154,55 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return ObjectUtils.isEmpty(result) ? null : result.iterator().next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#all()
*/
@Override
public List<T> all() {
return doFind(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#stream()
*/
@Override
public Stream<T> stream() {
return doStream();
return StreamUtils.createStreamFromIterator(doStream());
}
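For readers following the change above: on this branch doStream() hands back a CloseableIterator, so stream() adapts it into a Stream. Below is a hedged sketch of what such an adapter amounts to; the helper class and names are illustrative only, not the actual StreamUtils source.

import java.util.Spliterator;
import java.util.Spliterators;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

import org.springframework.data.util.CloseableIterator;

class CloseableIteratorStreams {

	static <T> Stream<T> toStream(CloseableIterator<T> iterator) {
		Spliterator<T> spliterator = Spliterators.spliteratorUnknownSize(iterator, Spliterator.ORDERED);
		// closing the Stream must release the underlying cursor as well
		return StreamSupport.stream(spliterator, false).onClose(iterator::close);
	}
}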
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery#near(org.springframework.data.mongodb.core.query.NearQuery)
*/
@Override
public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#count()
*/
@Override
public long count() {
return template.count(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind#exists()
*/
@Override
public boolean exists() {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindDistinct#distinct(java.lang.String)
*/
@SuppressWarnings("unchecked")
@Override
public TerminatingDistinct<Object> distinct(String field) {
@@ -178,7 +227,7 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
returnType == domainType ? (Class<T>) Object.class : returnType);
}
private Stream<T> doStream() {
private CloseableIterator<T> doStream() {
return template.doStream(query, domainType, getCollectionName(), returnType);
}
@@ -208,6 +257,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.CursorPreparer#prepare(com.mongodb.client.FindIterable)

*/
@Override
public FindIterable<Document> prepare(FindIterable<Document> iterable) {
@@ -242,6 +295,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
@@ -251,6 +308,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new DistinctOperationSupport<>((ExecutableFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingDistinct<T> matching(Query query) {
@@ -259,6 +320,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return new DistinctOperationSupport<>((ExecutableFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingDistinct#all()
*/
@Override
public List<T> all() {
return delegate.doFindDistinct(field);
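For orientation, a hedged usage sketch of the fluent find API this support class backs; the Person type and the "people" collection name are hypothetical.

import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class FluentFindSketch {

	// hypothetical mapped document
	static class Person {
		String firstname;
	}

	// query(Person.class) starts the fluent chain implemented above; all() runs the find.
	static List<Person> findLukes(MongoTemplate template) {
		return template.query(Person.class)
				.inCollection("people")
				.matching(Query.query(Criteria.where("firstname").is("luke")))
				.all();
	}
}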

View File

@@ -40,6 +40,10 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#insert(java.lang.Class)
*/
@Override
public <T> ExecutableInsert<T> insert(Class<T> domainType) {
@@ -67,6 +71,10 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
this.bulkMode = bulkMode;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingInsert#one(java.lang.Object)
*/
@Override
public T one(T object) {
@@ -75,6 +83,10 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
return template.insert(object, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingInsert#all(java.util.Collection)
*/
@Override
public Collection<T> all(Collection<? extends T> objects) {
@@ -83,6 +95,10 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
return template.insert(objects, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.TerminatingBulkInsert#bulk(java.util.Collection)
*/
@Override
public BulkWriteResult bulk(Collection<? extends T> objects) {
@@ -92,6 +108,10 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
.insert(new ArrayList<>(objects)).execute();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.InsertWithCollection#inCollection(java.lang.String)
*/
@Override
public InsertWithBulkMode<T> inCollection(String collection) {
@@ -100,6 +120,10 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
return new ExecutableInsertSupport<>(template, domainType, collection, bulkMode);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation.InsertWithBulkMode#withBulkMode(org.springframework.data.mongodb.core.BulkMode)
*/
@Override
public TerminatingBulkInsert<T> withBulkMode(BulkMode bulkMode) {

View File

@@ -41,6 +41,10 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
this.tempate = tempate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation#remove(java.lang.Class)
*/
@Override
public <T> ExecutableRemove<T> remove(Class<T> domainType) {
@@ -67,6 +71,10 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.RemoveWithCollection#inCollection(java.lang.String)
*/
@Override
public RemoveWithQuery<T> inCollection(String collection) {
@@ -75,6 +83,10 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
return new ExecutableRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.RemoveWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingRemove<T> matching(Query query) {
@@ -83,16 +95,28 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
return new ExecutableRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#all()
*/
@Override
public DeleteResult all() {
return template.doRemove(getCollectionName(), query, domainType, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#one()
*/
@Override
public DeleteResult one() {
return template.doRemove(getCollectionName(), query, domainType, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#findAndRemove()
*/
@Override
public List<T> findAndRemove() {

View File

@@ -40,6 +40,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation#update(java.lang.Class)
*/
@Override
public <T> ExecutableUpdate<T> update(Class<T> domainType) {
@@ -81,6 +85,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
this.targetType = targetType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithUpdate#apply(org.springframework.data.mongodb.core.query.UpdateDefinition)
*/
@Override
public TerminatingUpdate<T> apply(UpdateDefinition update) {
@@ -90,6 +98,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithCollection#inCollection(java.lang.String)
*/
@Override
public UpdateWithQuery<T> inCollection(String collection) {
@@ -99,6 +111,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.FindAndModifyWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndModifyOptions)
*/
@Override
public TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options) {
@@ -108,6 +124,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.UpdateWithUpdate#replaceWith(Object)
*/
@Override
public FindAndReplaceWithProjection<T> replaceWith(T replacement) {
@@ -117,6 +137,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.FindAndReplaceWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public FindAndReplaceWithProjection<T> withOptions(FindAndReplaceOptions options) {
@@ -126,6 +150,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
options, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public UpdateWithUpdate<T> matching(Query query) {
@@ -135,6 +163,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithProjection#as(java.lang.Class)
*/
@Override
public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) {
@@ -144,21 +176,37 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
findAndReplaceOptions, replacement, resultType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#all()
*/
@Override
public UpdateResult all() {
return doUpdate(true, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#first()
*/
@Override
public UpdateResult first() {
return doUpdate(false, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingUpdate#upsert()
*/
@Override
public UpdateResult upsert() {
return doUpdate(true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndModify#findAndModifyValue()
*/
@Override
public @Nullable T findAndModifyValue() {
@@ -167,6 +215,10 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndReplace#findAndReplaceValue()
*/
@Override
public @Nullable T findAndReplaceValue() {
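As a point of reference for the fluent update API this support class implements, a hedged usage sketch; the Person type and field names are hypothetical.

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.client.result.UpdateResult;

class FluentUpdateSketch {

	// hypothetical mapped document
	static class Person {
		String firstname;
		String lastname;
	}

	// matching(..) narrows the selection, apply(..) carries the UpdateDefinition,
	// first() updates at most one document.
	static UpdateResult renameFirstLuke(MongoTemplate template) {
		return template.update(Person.class)
				.matching(Query.query(Criteria.where("firstname").is("luke")))
				.apply(new Update().set("lastname", "skywalker"))
				.first();
	}
}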

View File

@@ -112,31 +112,55 @@ public class MappedDocument {
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getUpdateObject()
*/
@Override
public Document getUpdateObject() {
return delegate.getUpdateObject();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#modifies(java.lang.String)
*/
@Override
public boolean modifies(String key) {
return delegate.modifies(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#inc(java.lang.String)
*/
@Override
public void inc(String version) {
delegate.inc(version);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#isIsolated()
*/
@Override
public Boolean isIsolated() {
return delegate.isIsolated();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
@Override
public List<ArrayFilter> getArrayFilters() {
return delegate.getArrayFilters();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#hasArrayFilters()
*/
@Override
public boolean hasArrayFilters() {
return delegate.hasArrayFilters();

View File

@@ -24,6 +24,7 @@ import java.util.function.Predicate;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.PersistentProperty;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MongoConverter;
@@ -44,7 +45,6 @@ import org.springframework.data.util.ClassTypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
@@ -62,7 +62,6 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
private final MongoConverter converter;
private final MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final Predicate<JsonSchemaPropertyContext> filter;
private final LinkedMultiValueMap<String, Class<?>> mergeProperties;
/**
* Create a new instance of {@link MappingMongoJsonSchemaCreator}.
@@ -73,48 +72,29 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
MappingMongoJsonSchemaCreator(MongoConverter converter) {
this(converter, (MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty>) converter.getMappingContext(),
(property) -> true, new LinkedMultiValueMap<>());
(property) -> true);
}
@SuppressWarnings("unchecked")
MappingMongoJsonSchemaCreator(MongoConverter converter,
MappingContext<MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
Predicate<JsonSchemaPropertyContext> filter, LinkedMultiValueMap<String, Class<?>> mergeProperties) {
Predicate<JsonSchemaPropertyContext> filter) {
Assert.notNull(converter, "Converter must not be null!");
this.converter = converter;
this.mappingContext = mappingContext;
this.filter = filter;
this.mergeProperties = mergeProperties;
}
@Override
public MongoJsonSchemaCreator filter(Predicate<JsonSchemaPropertyContext> filter) {
return new MappingMongoJsonSchemaCreator(converter, mappingContext, filter, mergeProperties);
return new MappingMongoJsonSchemaCreator(converter, mappingContext, filter);
}
@Override
public PropertySpecifier property(String path) {
return types -> withTypesFor(path, types);
}
/**
* Specify additional types to be considered when rendering the schema for the given path.
*
* @param path the path using {@literal dot '.'} notation.
* @param types must not be {@literal null}.
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.4
/*
* (non-Javadoc)
* org.springframework.data.mongodb.core.MongoJsonSchemaCreator#createSchemaFor(java.lang.Class)
*/
public MongoJsonSchemaCreator withTypesFor(String path, Class<?>... types) {
LinkedMultiValueMap<String, Class<?>> clone = mergeProperties.clone();
for (Class<?> type : types) {
clone.add(path, type);
}
return new MappingMongoJsonSchemaCreator(converter, mappingContext, filter, clone);
}
@Override
public MongoJsonSchema createSchemaFor(Class<?> type) {
@@ -155,12 +135,9 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
List<MongoPersistentProperty> currentPath = new ArrayList<>(path);
String stringPath = currentPath.stream().map(PersistentProperty::getName).collect(Collectors.joining("."));
stringPath = StringUtils.hasText(stringPath) ? (stringPath + "." + nested.getName()) : nested.getName();
if (!filter.test(new PropertyContext(stringPath, nested))) {
if (!mergeProperties.containsKey(stringPath)) {
continue;
}
if (!filter.test(new PropertyContext(
currentPath.stream().map(PersistentProperty::getName).collect(Collectors.joining(".")), nested))) {
continue;
}
if (path.contains(nested)) { // cycle guard
@@ -178,34 +155,14 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
private JsonSchemaProperty computeSchemaForProperty(List<MongoPersistentProperty> path) {
String stringPath = path.stream().map(MongoPersistentProperty::getName).collect(Collectors.joining("."));
MongoPersistentProperty property = CollectionUtils.lastElement(path);
boolean required = isRequiredProperty(property);
Class<?> rawTargetType = computeTargetType(property); // target type before conversion
Class<?> targetType = converter.getTypeMapper().getWriteTargetTypeFor(rawTargetType); // conversion target type
if (!isCollection(property) && ObjectUtils.nullSafeEquals(rawTargetType, targetType)) {
if (property.isEntity() || mergeProperties.containsKey(stringPath)) {
List<JsonSchemaProperty> targetProperties = new ArrayList<>();
if (property.isEntity()) {
targetProperties.add(createObjectSchemaPropertyForEntity(path, property, required));
}
if (mergeProperties.containsKey(stringPath)) {
for (Class<?> theType : mergeProperties.get(stringPath)) {
ObjectJsonSchemaProperty target = JsonSchemaProperty.object(property.getName());
List<JsonSchemaProperty> nestedProperties = computePropertiesForEntity(path,
mappingContext.getRequiredPersistentEntity(theType));
targetProperties.add(createPotentiallyRequiredSchemaProperty(
target.properties(nestedProperties.toArray(new JsonSchemaProperty[0])), required));
}
}
return targetProperties.size() == 1 ? targetProperties.iterator().next()
: JsonSchemaProperty.merged(targetProperties);
}
if (!isCollection(property) && property.isEntity() && ObjectUtils.nullSafeEquals(rawTargetType, targetType)) {
return createObjectSchemaPropertyForEntity(path, property, required);
}
String fieldName = computePropertyFieldName(property);

View File

@@ -46,16 +46,25 @@ public class MongoAdmin implements MongoAdminOperations {
this.mongoClient = client;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation
public void dropDatabase(String databaseName) {
getDB(databaseName).drop();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation
public void createDatabase(String databaseName) {
getDB(databaseName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation
public String getDatabaseStats(String databaseName) {
return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale", 1024)).toJson();

View File

@@ -119,15 +119,27 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends MongoClient> getObjectType() {
return MongoClient.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClient createInstance() throws Exception {
return createMongoClient(computeClientSetting());
@@ -324,6 +336,10 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
return !fromConnectionStringIsDefault ? fromConnectionString : defaultValue;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(@Nullable MongoClient instance) throws Exception {
@@ -337,11 +353,6 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
}
private String getOrDefault(Object value, String defaultValue) {
if(value == null) {
return defaultValue;
}
String sValue = value.toString();
return StringUtils.hasText(sValue) ? sValue : defaultValue;
return !StringUtils.isEmpty(value) ? value.toString() : defaultValue;
}
}

View File

@@ -84,10 +84,18 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
this.writeConcern = writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase()
*/
public MongoDatabase getMongoDatabase() throws DataAccessException {
return getMongoDatabase(getDefaultDatabaseName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase(java.lang.String)
*/
@Override
public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException {
@@ -110,16 +118,28 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
*/
protected abstract MongoDatabase doGetMongoDatabase(String dbName);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
*/
public void destroy() throws Exception {
if (mongoInstanceCreated) {
closeClient();
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.Session)
*/
public MongoDatabaseFactory withSession(ClientSession session) {
return new MongoDatabaseFactorySupport.ClientSessionBoundMongoDbFactory(session, this);
}
@@ -160,31 +180,55 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase()
*/
@Override
public MongoDatabase getMongoDatabase() throws DataAccessException {
return proxyMongoDatabase(delegate.getMongoDatabase());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getMongoDatabase(java.lang.String)
*/
@Override
public MongoDatabase getMongoDatabase(String dbName) throws DataAccessException {
return proxyMongoDatabase(delegate.getMongoDatabase(dbName));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return delegate.getExceptionTranslator();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return delegate.getSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public MongoDatabaseFactory withSession(ClientSession session) {
return delegate.withSession(session);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#isTransactionActive()
*/
@Override
public boolean isTransactionActive() {
return session != null && session.hasActiveTransaction();

View File

@@ -0,0 +1,50 @@
/*
* Copyright 2018-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.support.PersistenceExceptionTranslator;
/**
* Common base class for usage with {@link com.mongodb.client.MongoClients}, defining common properties such as
* database name and exception translator.
* <br />
* Not intended to be used directly.
*
* @author Christoph Strobl
* @author Mark Paluch
* @param <C> Client type.
* @since 2.1
* @see SimpleMongoClientDatabaseFactory
* @deprecated since 3.0, use {@link MongoDatabaseFactorySupport} instead.
*/
@Deprecated
public abstract class MongoDbFactorySupport<C> extends MongoDatabaseFactorySupport<C> {
/**
* Create a new {@link MongoDbFactorySupport} object given {@code mongoClient}, {@code databaseName},
* {@code mongoInstanceCreated} and {@link PersistenceExceptionTranslator}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param mongoInstanceCreated {@literal true} if the client instance was created by a subclass of
* {@link MongoDbFactorySupport} to close the client on {@link #destroy()}.
* @param exceptionTranslator must not be {@literal null}.
*/
protected MongoDbFactorySupport(C mongoClient, String databaseName, boolean mongoInstanceCreated,
PersistenceExceptionTranslator exceptionTranslator) {
super(mongoClient, databaseName, mongoInstanceCreated, exceptionTranslator);
}
}
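Since the class above only exists for backwards compatibility, here is a hedged sketch of wiring the non-deprecated MongoDatabaseFactorySupport subclass its javadoc points to; the connection string and database name are placeholders.

import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;

import com.mongodb.client.MongoClients;

class FactorySketch {

	static MongoTemplate template() {
		// SimpleMongoClientDatabaseFactory extends MongoDatabaseFactorySupport<MongoClient>
		MongoDatabaseFactory factory = new SimpleMongoClientDatabaseFactory(
				MongoClients.create("mongodb://localhost:27017"), "database");
		return new MongoTemplate(factory);
	}
}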

View File

@@ -88,6 +88,10 @@ public class MongoEncryptionSettingsFactoryBean implements FactoryBean<AutoEncry
this.schemaMap = schemaMap;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public AutoEncryptionSettings getObject() {
@@ -105,6 +109,10 @@ public class MongoEncryptionSettingsFactoryBean implements FactoryBean<AutoEncry
return source != null ? source : Collections.emptyMap();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return AutoEncryptionSettings.class;

View File

@@ -68,6 +68,10 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
private static final Set<String> DATA_INTEGRITY_EXCEPTIONS = new HashSet<>(
Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {

View File

@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import java.util.function.Predicate;
@@ -63,6 +62,7 @@ import org.springframework.util.Assert;
* {@link org.springframework.data.annotation.Id _id} properties using types that can be converted into
* {@link org.bson.types.ObjectId} like {@link String} will be mapped to {@code type : 'object'} unless there is more
* specific information available via the {@link org.springframework.data.mongodb.core.mapping.MongoId} annotation.
* {@link Encrypted} properties will contain {@literal encrypt} information.
*
* @author Christoph Strobl
@@ -78,20 +78,6 @@ public interface MongoJsonSchemaCreator {
*/
MongoJsonSchema createSchemaFor(Class<?> type);
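A hedged usage sketch of createSchemaFor; the Person class is hypothetical and the converter is assumed to be the MappingMongoConverter backing the template.

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class SchemaSketch {

	// hypothetical mapped document
	static class Person {
		String firstname;
		int age;
	}

	// Derives a $jsonSchema document from the mapping metadata known to the converter.
	static Document schemaFor(MongoConverter converter) {
		MongoJsonSchema schema = MongoJsonSchemaCreator.create(converter).createSchemaFor(Person.class);
		return schema.toDocument();
	}
}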
/**
* Create a merged {@link MongoJsonSchema} out of the individual schemas of the given types by merging their
* properties into one large {@link MongoJsonSchema schema}.
*
* @param types must not be {@literal null} nor contain {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
default MongoJsonSchema mergedSchemaFor(Class<?>... types) {
MongoJsonSchema[] schemas = Arrays.stream(types).map(this::createSchemaFor).toArray(MongoJsonSchema[]::new);
return MongoJsonSchema.merge(schemas);
}
/**
* Filter matching {@link JsonSchemaProperty properties}.
*
@@ -101,39 +87,30 @@ public interface MongoJsonSchemaCreator {
*/
MongoJsonSchemaCreator filter(Predicate<JsonSchemaPropertyContext> filter);
/**
* Entry point to specify additional behavior for a given path.
*
* @param path the path using {@literal dot '.'} notation.
* @return new instance of {@link PropertySpecifier}.
* @since 3.4
*/
PropertySpecifier property(String path);
/**
* The context in which a specific {@link #getProperty()} is encountered during schema creation.
*
*
* @since 3.3
*/
interface JsonSchemaPropertyContext {
/**
* The path to a given field/property in dot notation.
*
*
* @return never {@literal null}.
*/
String getPath();
/**
* The current property.
*
*
* @return never {@literal null}.
*/
MongoPersistentProperty getProperty();
/**
* Obtain the {@link MongoPersistentEntity} for a given property.
*
*
* @param property must not be {@literal null}.
* @param <T>
* @return {@literal null} if the property is not an entity. It is nevertheless recommend to check
@@ -232,19 +209,4 @@ public interface MongoJsonSchemaCreator {
return create(converter);
}
/**
* @author Christoph Strobl
* @since 3.4
*/
interface PropertySpecifier {
/**
* Set additional type parameters for polymorphic ones.
*
* @param types must not be {@literal null}.
* @return the source
*/
MongoJsonSchemaCreator withTypes(Class<?>... types);
}
}

View File

@@ -20,7 +20,6 @@ import java.util.List;
import java.util.Set;
import java.util.function.Consumer;
import java.util.function.Supplier;
import java.util.stream.Stream;
import org.bson.Document;
import org.springframework.data.geo.GeoResults;
@@ -33,6 +32,8 @@ import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.BasicQuery;
@@ -41,6 +42,7 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.util.CloseableIterator;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -55,7 +57,8 @@ import com.mongodb.client.result.UpdateResult;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
* a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be the target of a JDK
* proxy). <br />
* proxy).
* <br />
* <strong>NOTE:</strong> Some operations cannot be executed within a MongoDB transaction. Please refer to the MongoDB
* specific documentation to learn more about <a href="https://docs.mongodb.com/manual/core/transactions/">Multi
* Document Transactions</a>.
@@ -77,11 +80,12 @@ public interface MongoOperations extends FluentMongoOperations {
*
* @param entityClass must not be {@literal null}.
* @return never {@literal null}.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be derived from the type.
*/
String getCollectionName(Class<?> entityClass);
/**
* Execute a MongoDB command expressed as a JSON string. Parsing is delegated to {@link Document#parse(String)} to
* Execute the a MongoDB command expressed as a JSON string. Parsing is delegated to {@link Document#parse(String)} to
* obtain the {@link Document} holding the actual command. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
@@ -121,7 +125,8 @@ public interface MongoOperations extends FluentMongoOperations {
void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch);
/**
* Executes a {@link DbCallback} translating any exceptions as necessary. <br />
* Executes a {@link DbCallback} translating any exceptions as necessary.
* <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance. Must not
@@ -133,7 +138,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T execute(DbCallback<T> action);
/**
* Executes the given {@link CollectionCallback} on the entity collection of the specified class. <br />
* Executes the given {@link CollectionCallback} on the entity collection of the specified class.
* <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
@@ -145,7 +151,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T execute(Class<?> entityClass, CollectionCallback<T> action);
/**
* Executes the given {@link CollectionCallback} on the collection of the given name. <br />
* Executes the given {@link CollectionCallback} on the collection of the given name.
* <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param collectionName the name of the collection that specifies which {@link MongoCollection} instance will be
@@ -169,7 +176,8 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding the {@link ClientSession}
* provided by the given {@link Supplier} to each and every command issued against MongoDB. <br />
* provided by the given {@link Supplier} to each and every command issued against MongoDB.
* <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use the
* {@link SessionScoped#execute(SessionCallback, Consumer)} hook to potentially close the {@link ClientSession}.
*
@@ -204,7 +212,8 @@ public interface MongoOperations extends FluentMongoOperations {
}
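To make the session-lifecycle note above concrete, a hedged sketch; the MongoClient, the operations instance, and the Person type are assumed to exist elsewhere.

import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;

import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoClient;

class SessionScopedSketch {

	// hypothetical mapped document
	static class Person {
		String firstname;
	}

	// The caller owns the session: it is bound to every command issued through the
	// callback and closed via the Consumer handed to execute(..).
	static List<Person> readAllInSession(MongoOperations operations, MongoClient client) {
		ClientSession session = client.startSession();
		return operations.withSession(() -> session)
				.execute(bound -> bound.findAll(Person.class), ClientSession::close);
	}
}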
/**
* Obtain a {@link ClientSession} bound instance of {@link MongoOperations}. <br />
* Obtain a {@link ClientSession} bound instance of {@link MongoOperations}.
* <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle.
*
* @param session must not be {@literal null}.
@@ -217,34 +226,34 @@ public interface MongoOperations extends FluentMongoOperations {
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link com.mongodb.client.FindIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to
* be closed.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param <T> element return type
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @return will never be {@literal null}.
* @since 1.7
*/
<T> Stream<T> stream(Query query, Class<T> entityType);
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
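A hedged usage sketch for the cursor-backed variant declared above; the Person type is hypothetical, and the iterator is closed via try-with-resources as the javadoc requires.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.CloseableIterator;

class StreamQuerySketch {

	// hypothetical mapped document
	static class Person {
		String lastname;
	}

	// CloseableIterator implements Closeable, so try-with-resources releases the cursor.
	static long countMatthewsByIteration(MongoOperations operations) {
		long seen = 0;
		try (CloseableIterator<Person> people = operations.stream(
				Query.query(Criteria.where("lastname").is("Matthews")), Person.class)) {
			while (people.hasNext()) {
				people.next();
				seen++;
			}
		}
		return seen;
	}
}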
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} and collection backed
* by a Mongo DB {@link com.mongodb.client.FindIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to be closed.
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link com.mongodb.client.FindIterable} that needs to
* be closed.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification. Must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @param <T> element return type
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @return will never be {@literal null}.
* @since 1.10
*/
<T> Stream<T> stream(Query query, Class<T> entityType, String collectionName);
<T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName);
/**
* Create an uncapped collection with a name based on the provided entity class.
@@ -291,7 +300,8 @@ public interface MongoOperations extends FluentMongoOperations {
* Get a {@link MongoCollection} by its name. The returned collection may not exists yet (except in local memory) and
* is created on first interaction with the server. Collections can be explicitly created via
* {@link #createCollection(Class)}. Please make sure to check if the collection {@link #collectionExists(Class)
* exists} first. <br />
* exists} first.
* <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection. Must not be {@literal null}.
@@ -300,7 +310,8 @@ public interface MongoOperations extends FluentMongoOperations {
MongoCollection<Document> getCollection(String collectionName);
/**
* Check to see if a collection with a name indicated by the entity class exists. <br />
* Check to see if a collection with a name indicated by the entity class exists.
* <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the name of the collection. Must not be {@literal null}.
@@ -309,7 +320,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> boolean collectionExists(Class<T> entityClass);
/**
* Check to see if a collection with a given name exists. <br />
* Check to see if a collection with a given name exists.
* <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection. Must not be {@literal null}.
@@ -318,7 +330,8 @@ public interface MongoOperations extends FluentMongoOperations {
boolean collectionExists(String collectionName);
/**
* Drop the collection with the name indicated by the entity class. <br />
* Drop the collection with the name indicated by the entity class.
* <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the collection to drop/delete. Must not be {@literal null}.
@@ -326,7 +339,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> void dropCollection(Class<T> entityClass);
/**
* Drop the collection with the given name. <br />
* Drop the collection with the given name.
* <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection to drop/delete.
@@ -389,9 +403,11 @@ public interface MongoOperations extends FluentMongoOperations {
BulkOperations bulkOps(BulkMode mode, @Nullable Class<?> entityType, String collectionName);
/**
* Query for a list of objects of type T from the collection used by the entity class. <br />
* Query for a list of objects of type T from the collection used by the entity class.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -401,9 +417,11 @@ public interface MongoOperations extends FluentMongoOperations {
<T> List<T> findAll(Class<T> entityClass);
/**
* Query for a list of objects of type T from the specified collection. <br />
* Query for a list of objects of type T from the specified collection.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -413,6 +431,43 @@ public interface MongoOperations extends FluentMongoOperations {
*/
<T> List<T> findAll(Class<T> entityClass, String collectionName);
/**
* Execute a group operation over the entire collection. The group operation entity class should match the 'shape' of
* the returned object that takes into account the initial document structure as well as any finalize functions.
*
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parametrized type of the returned list
* @return The results of the group operation
* @deprecated since 2.2. The {@code group} command has been removed in MongoDB Server 4.2.0. <br />
* Please use {@link #aggregate(TypedAggregation, String, Class) } with a
* {@link org.springframework.data.mongodb.core.aggregation.GroupOperation} instead.
*/
@Deprecated
<T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass);
/**
* Execute a group operation restricting the rows to those which match the provided Criteria. The group operation
* entity class should match the 'shape' of the returned object that takes int account the initial document structure
* as well as any finalize functions.
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
* @param groupBy the conditions under which the group operation will be performed, e.g. keys, initial document,
* reduce function.
* @param entityClass The parametrized type of the returned list
* @return The results of the group operation
* @deprecated since 2.2. The {@code group} command has been removed in MongoDB Server 4.2.0. <br />
* Please use {@link #aggregate(TypedAggregation, String, Class) } with a
* {@link org.springframework.data.mongodb.core.aggregation.GroupOperation} and
* {@link org.springframework.data.mongodb.core.aggregation.MatchOperation} instead.
*/
@Deprecated
<T> GroupByResults<T> group(@Nullable Criteria criteria, String inputCollectionName, GroupBy groupBy,
Class<T> entityClass);
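The deprecation notes above point to the aggregation framework; here is a hedged sketch of such a replacement, where the collection name, field names and the OrderSummary result type are made up for illustration.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.query.Criteria;

class GroupReplacementSketch {

	// hypothetical result type
	static class OrderSummary {
		String customerId;
		long orders;
	}

	// match(..) plays the role of the group criteria, group(..) the role of GroupBy.
	static AggregationResults<OrderSummary> openOrdersPerCustomer(MongoOperations operations) {
		Aggregation aggregation = Aggregation.newAggregation(
				Aggregation.match(Criteria.where("status").is("OPEN")),
				Aggregation.group("customerId").count().as("orders"));
		return operations.aggregate(aggregation, "orders", OrderSummary.class);
	}
}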
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
@@ -467,9 +522,9 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class. The name of the inputCollection is derived from
* the inputType of the aggregation.
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class. The name of the inputCollection is
* derived from the inputType of the aggregation.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
@@ -478,37 +533,35 @@ public interface MongoOperations extends FluentMongoOperations {
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @return The results of the aggregation operation.
* @since 2.0
*/
<O> Stream<O> aggregateStream(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
<O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
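A hedged usage sketch for the streaming variant declared above; the Order type, field names and the "orders" collection are placeholders.

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.util.CloseableIterator;

class AggregateStreamSketch {

	// hypothetical mapped document
	static class Order {
		String status;
		String customerId;
	}

	// The iterator wraps the AggregateIterable cursor and must be closed when done.
	static void printOpenOrderGroups(MongoOperations operations) {
		TypedAggregation<Order> aggregation = Aggregation.newAggregation(Order.class,
				Aggregation.match(Criteria.where("status").is("OPEN")),
				Aggregation.group("customerId").count().as("orders"));

		try (CloseableIterator<Document> groups = operations.aggregateStream(aggregation, "orders", Document.class)) {
			groups.forEachRemaining(System.out::println);
		}
	}
}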
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class and are returned as stream. The name of the
* inputCollection is derived from the inputType of the aggregation.
* <p>
* <br />
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class and are returned as stream. The name
* of the inputCollection is derived from the inputType of the aggregation.
* <br />
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @return The results of the aggregation operation.
* @since 2.0
*/
<O> Stream<O> aggregateStream(TypedAggregation<?> aggregation, Class<O> outputType);
<O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class.
* <p>
* <br />
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class.
* <br />
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
@@ -517,18 +570,17 @@ public interface MongoOperations extends FluentMongoOperations {
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @return The results of the aggregation operation.
* @since 2.0
*/
<O> Stream<O> aggregateStream(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
<O> CloseableIterator<O> aggregateStream(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link com.mongodb.client.AggregateIterable}.
* <p>
* Returns a {@link Stream} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that needs to be
* closed. The raw results will be mapped to the given entity class.
* <p>
* <br />
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link com.mongodb.client.AggregateIterable} that
* needs to be closed. The raw results will be mapped to the given entity class.
* <br />
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
@@ -537,11 +589,10 @@ public interface MongoOperations extends FluentMongoOperations {
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parametrized type of the returned list, must not be {@literal null}.
* @return the result {@link Stream}, containing mapped objects, needing to be closed once fully processed (e.g.
* through a try-with-resources clause).
* @return The results of the aggregation operation.
* @since 2.0
*/
<O> Stream<O> aggregateStream(Aggregation aggregation, String collectionName, Class<O> outputType);
<O> CloseableIterator<O> aggregateStream(Aggregation aggregation, String collectionName, Class<O> outputType);
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
@@ -551,9 +602,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param reduceFunction The JavaScript reduce function
* @param entityClass The parametrized type of the returned list. Must not be {@literal null}.
* @return The results of the map reduce operation
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass);
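A hedged usage sketch of the inline map-reduce declared above; the collection name, JavaScript functions and the ValueObject type are illustrative only.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;

class MapReduceSketch {

	// hypothetical result type
	static class ValueObject {
		String id;
		float value;
	}

	// Results are returned INLINE and mapped onto ValueObject.
	static MapReduceResults<ValueObject> sumValues(MongoOperations operations) {
		String map = "function () { emit(this.id, this.value); }";
		String reduce = "function (key, values) { return Array.sum(values); }";
		return operations.mapReduce("jmr1", map, reduce, ValueObject.class);
	}
}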
@@ -566,9 +615,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param mapReduceOptions Options that specify detailed map-reduce behavior.
* @param entityClass The parametrized type of the returned list. Must not be {@literal null}.
* @return The results of the map reduce operation
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
@Nullable MapReduceOptions mapReduceOptions, Class<T> entityClass);
@@ -582,9 +629,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param reduceFunction The JavaScript reduce function
* @param entityClass The parametrized type of the returned list. Must not be {@literal null}.
* @return The results of the map reduce operation
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass);
@@ -598,9 +643,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param mapReduceOptions Options that specify detailed map-reduce behavior
* @param entityClass The parametrized type of the returned list. Must not be {@literal null}.
* @return The results of the map reduce operation
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
@Nullable MapReduceOptions mapReduceOptions, Class<T> entityClass);
@@ -659,9 +702,11 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity class to a single instance of an object of the
* specified type. <br />
* specified type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -675,9 +720,11 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object of the specified
* type. <br />
* type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -721,9 +768,11 @@ public interface MongoOperations extends FluentMongoOperations {
boolean exists(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type. <br />
* Map the results of an ad-hoc query on the collection for the entity class to a List of the specified type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -735,9 +784,11 @@ public interface MongoOperations extends FluentMongoOperations {
<T> List<T> find(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the specified collection to a List of the specified type. <br />
* Map the results of an ad-hoc query on the specified collection to a List of the specified type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -918,6 +969,8 @@ public interface MongoOperations extends FluentMongoOperations {
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the converted object that was updated or {@literal null}, if not found.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 2.1
*/
@Nullable
@@ -959,6 +1012,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @return the converted object that was updated or {@literal null}, if not found. Depending on the value of
* {@link FindAndReplaceOptions#isReturnNew()} this will either be the object as it was before the update or
* as it is after the update.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 2.1
*/
@Nullable
@@ -1032,6 +1087,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @return the converted object that was updated or {@literal null}, if not found. Depending on the value of
* {@link FindAndReplaceOptions#isReturnNew()} this will either be the object as it was before the update or
* as it is after the update.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 2.1
*/
@Nullable
@@ -1069,8 +1126,10 @@ public interface MongoOperations extends FluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity type to a single instance of an object of the
* specified type. The first document that matches the query is returned and also removed from the collection in the
* database. <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. <br />
* database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1087,7 +1146,8 @@ public interface MongoOperations extends FluentMongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1106,96 +1166,8 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @see #exactCount(Query, Class)
* @see #estimatedCount(Class)
*/
long count(Query query, Class<?> entityClass);
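As a usage sketch of the count contract above (the `Person` class, its fields, and the injected `MongoOperations` instance are illustrative assumptions, not part of this change):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;

class CountExample {

    static class Person { String id; String name; int age; }

    long countAdults(MongoOperations operations) {
        // an unpaged query counts all matches; skip/limit would restrict the
        // range the server counts over, as the note above explains
        return operations.count(query(where("age").gte(18)), Person.class);
    }
}
```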
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @see #exactCount(Query, String)
* @see #estimatedCount(String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @see #estimatedCount(String)
*/
long count(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations on e.g. sharded clusters or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return the estimated number of documents.
* @since 3.1
*/
default long estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return estimatedCount(getCollectionName(entityClass));
}
/**
* Estimate the number of documents in the given collection based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations on e.g. sharded clusters or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return the estimated number of documents.
* @since 3.1
*/
long estimatedCount(String collectionName);
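A short sketch contrasting the two counting styles described above (the `Person` class and the injected operations instance are hypothetical):

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class EstimatedCountExample {

    static class Person { String id; String name; }

    void counts(MongoOperations operations) {
        // fast, metadata-based estimate; may be inaccurate on sharded clusters or inside transactions
        long estimate = operations.estimatedCount(Person.class);

        // query-based count over the same collection
        long counted = operations.count(new Query(), Person.class);
    }
}
```

The estimate reads collection statistics instead of scanning documents, which is why the Javadoc points to the server-side limitations.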
/**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* count all matches.
* <br />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
@@ -1206,11 +1178,10 @@ public interface MongoOperations extends FluentMongoOperations {
* {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @since 3.4
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
*/
default long exactCount(Query query, Class<?> entityClass) {
return exactCount(query, entityClass, getCollectionName(entityClass));
}
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
@@ -1219,7 +1190,8 @@ public interface MongoOperations extends FluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* count all matches.
* <br />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
@@ -1230,19 +1202,48 @@ public interface MongoOperations extends FluentMongoOperations {
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @since 3.4
*/
default long exactCount(Query query, String collectionName) {
return exactCount(query, null, collectionName);
long count(Query query, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <br />
* Please make sure to read the MongoDB reference documentation about limitations on e.g. sharded clusters or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return the estimated number of documents.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @since 3.1
*/
default long estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return estimatedCount(getCollectionName(entityClass));
}
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <br />
* Please make sure to read the MongoDB reference documentation about limitations on e.g. sharded clusters or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return the estimated number of documents.
* @since 3.1
*/
long estimatedCount(String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* count all matches.
* <br />
* This method uses an
* {@link com.mongodb.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
@@ -1254,18 +1255,20 @@ public interface MongoOperations extends FluentMongoOperations {
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @since 3.4
*/
long exactCount(Query query, @Nullable Class<?> entityClass, String collectionName);
long count(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* Insert the object into the collection for the entity type of the object to save.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion</a> for more details. <br />
* Type Conversion</a> for more details.
* <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1273,13 +1276,17 @@ public interface MongoOperations extends FluentMongoOperations {
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return the inserted object.
* @throws IllegalArgumentException in case the {@code objectToSave} is collection-like.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given object type.
*/
<T> T insert(T objectToSave);
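To illustrate the Id-population behaviour described above, a minimal sketch (the `Person` class and field names are assumptions for illustration):

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;

class InsertExample {

    static class Person {
        @Id String id; // populated from the generated ObjectId, converted to String
        String name;
        Person(String name) { this.name = name; }
    }

    Person insertOne(MongoOperations operations) {
        Person saved = operations.insert(new Person("Ada"));
        // saved.id now carries the server-generated identifier
        return saved;
    }
}
```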
/**
* Insert the object into the specified collection. <br />
* Insert the object into the specified collection.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1297,6 +1304,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @param batchToSave the batch of objects to save. Must not be {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the inserted objects.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
*/
<T> Collection<T> insert(Collection<? extends T> batchToSave, Class<?> entityClass);
@@ -1315,37 +1324,46 @@ public interface MongoOperations extends FluentMongoOperations {
*
* @param objectsToSave the list of objects to save. Must not be {@literal null}.
* @return the inserted objects.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} for the given objects.
*/
<T> Collection<T> insertAll(Collection<? extends T> objectsToSave);
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'. <br />
* object is not already present, that is an 'upsert'.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion</a> for more details. <br />
* Type Conversion</a> for more details.
* <br />
* The {@code objectToSave} must not be collection-like.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return the saved object.
* @throws IllegalArgumentException in case the {@code objectToSave} is collection-like.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given object type.
*/
<T> T save(T objectToSave);
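A small sketch of the 'upsert' semantics of save described above (again assuming a hypothetical `Person` class):

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;

class SaveExample {

    static class Person {
        @Id String id;
        String name;
        Person(String name) { this.name = name; }
    }

    void insertThenUpdate(MongoOperations operations) {
        Person person = operations.save(new Person("Ada")); // no id yet -> insert
        person.name = "Ada Lovelace";
        operations.save(person); // same id -> the existing document is replaced
    }
}
```

Note that save replaces the whole document; partial changes are better expressed through the update methods further below.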
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'. <br />
* is an 'upsert'.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details. <br />
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type Conversion</a> for more details.
* <br />
* The {@code objectToSave} must not be collection-like.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1367,9 +1385,11 @@ public interface MongoOperations extends FluentMongoOperations {
* the existing object. Must not be {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
* @since 3.0
* @see Update
* @see AggregationUpdate
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @since 3.0
*/
UpdateResult upsert(Query query, UpdateDefinition update, Class<?> entityClass);
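A hedged usage sketch of the upsert contract (the `Person` class and the `age`/`name` fields are assumptions):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;
import static org.springframework.data.mongodb.core.query.Update.update;

import org.springframework.data.mongodb.core.MongoOperations;
import com.mongodb.client.result.UpdateResult;

class UpsertExample {

    static class Person { String id; String name; int age; }

    UpdateResult upsertAge(MongoOperations operations) {
        UpdateResult result = operations.upsert(
                query(where("name").is("Ada")), update("age", 36), Person.class);
        // result.getUpsertedId() is non-null when no document matched and one was inserted
        return result;
    }
}
```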
@@ -1421,9 +1441,11 @@ public interface MongoOperations extends FluentMongoOperations {
* the existing. Must not be {@literal null}.
* @param entityClass class that determines the collection to use.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
* @since 3.0
* @see Update
* @see AggregationUpdate
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @since 3.0
*/
UpdateResult updateFirst(Query query, UpdateDefinition update, Class<?> entityClass);
@@ -1475,9 +1497,11 @@ public interface MongoOperations extends FluentMongoOperations {
* the existing. Must not be {@literal null}.
* @param entityClass class of the pojo to be operated on. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
* @since 3.0
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @see Update
* @see AggregationUpdate
* @since 3.0
*/
UpdateResult updateMulti(Query query, UpdateDefinition update, Class<?> entityClass);
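For the multi-document variant, a minimal sketch (the `minor` flag on the hypothetical `Person` class is purely illustrative):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

class UpdateMultiExample {

    static class Person { String id; String name; int age; boolean minor; }

    long flagMinors(MongoOperations operations) {
        // updates every matching document and reports how many were modified
        return operations.updateMulti(
                query(where("age").lt(18)), new Update().set("minor", true), Person.class)
                .getModifiedCount();
    }
}
```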
@@ -1525,6 +1549,8 @@ public interface MongoOperations extends FluentMongoOperations {
*
* @param object must not be {@literal null}.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given object type.
*/
DeleteResult remove(Object object);
@@ -1548,6 +1574,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @param entityClass class that determines the collection to use.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query} or {@literal entityClass} is {@literal null}.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
*/
DeleteResult remove(Query query, Class<?> entityClass);
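A short sketch of the remove contract (hypothetical `Person` class and criteria):

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;

class RemoveExample {

    static class Person { String id; String name; int age; }

    long removeRetired(MongoOperations operations) {
        // the query is mapped through the converter to resolve field names before deletion
        return operations.remove(query(where("age").gte(67)), Person.class).getDeletedCount();
    }
}
```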
@@ -1595,6 +1623,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @param query the query document that specifies the criteria used to find and remove documents.
* @param entityClass class of the pojo to be operated on.
* @return the {@link List} converted objects deleted by this operation.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @since 1.5
*/
<T> List<T> findAllAndRemove(Query query, Class<T> entityClass);


@@ -16,19 +16,18 @@
package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.data.mapping.SimplePropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.PersistentPropertyTranslator;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.util.Predicates;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.projection.ProjectionInformation;
import org.springframework.util.ClassUtils;
/**
* Common operations performed on properties of an entity like extracting fields information for projection creation.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
class PropertyOperations {
@@ -41,37 +40,37 @@ class PropertyOperations {
/**
* For cases where {@code fields} is {@link Document#isEmpty() empty} include only fields that are required for
* creating the projection (target) type if the {@code EntityProjection} is a {@literal DTO projection} or a
* creating the projection (target) type if the {@code targetType} is a {@literal DTO projection} or a
* {@literal closed interface projection}.
*
* @param projection must not be {@literal null}.
* @param projectionFactory must not be {@literal null}.
* @param fields must not be {@literal null}.
* @param domainType must not be {@literal null}.
* @param targetType must not be {@literal null}.
* @return {@link Document} with fields to be included.
*/
Document computeMappedFieldsForProjection(EntityProjection<?, ?> projection,
Document fields) {
Document computeFieldsForProjection(ProjectionFactory projectionFactory, Document fields, Class<?> domainType,
Class<?> targetType) {
if (!projection.isClosedProjection()) {
if (!fields.isEmpty() || ClassUtils.isAssignable(domainType, targetType)) {
return fields;
}
Document projectedFields = new Document();
if (projection.getMappedType().getType().isInterface()) {
projection.forEach(it -> {
projectedFields.put(it.getPropertyPath().getSegment(), 1);
});
if (targetType.isInterface()) {
ProjectionInformation projectionInformation = projectionFactory.getProjectionInformation(targetType);
if (projectionInformation.isClosed()) {
projectionInformation.getInputProperties().forEach(it -> projectedFields.append(it.getName(), 1));
}
} else {
// DTO projections use merged metadata between domain type and result type
PersistentPropertyTranslator translator = PersistentPropertyTranslator.create(
mappingContext.getRequiredPersistentEntity(projection.getDomainType()),
Predicates.negate(MongoPersistentProperty::hasExplicitFieldName));
MongoPersistentEntity<?> persistentEntity = mappingContext
.getRequiredPersistentEntity(projection.getMappedType());
for (MongoPersistentProperty property : persistentEntity) {
projectedFields.put(translator.translate(property).getFieldName(), 1);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(targetType);
if (entity != null) {
entity.doWithProperties(
(SimplePropertyHandler) persistentProperty -> projectedFields.append(persistentProperty.getName(), 1));
}
}
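The change above swaps the projection-field computation from the `EntityProjection` model back to the `ProjectionFactory`/`targetType` variant. For context, a hedged sketch of what a closed interface projection looks like from the caller's side of the fluent API; the `Person` domain class and `FirstnameOnly` projection are assumptions, not part of this change:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;
import org.springframework.data.mongodb.core.MongoOperations;

class ProjectionExample {

    static class Person { String id; String firstname; String lastname; int age; }

    // closed interface projection: only the properties exposed here are requested from the server
    interface FirstnameOnly {
        String getFirstname();
    }

    List<FirstnameOnly> firstnames(MongoOperations operations) {
        return operations.query(Person.class)
                .as(FirstnameOnly.class)
                .matching(query(where("age").gte(18)))
                .all();
    }
}
```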


@@ -28,7 +28,6 @@ import java.util.stream.Collectors;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.codecs.Codec;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
@@ -55,10 +54,11 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.data.projection.EntityProjection;
import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.util.Lazy;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.client.model.CountOptions;
@@ -288,59 +288,45 @@ class QueryOperations {
return queryMapper.getMappedObject(getQueryObject(), entity);
}
Document getMappedFields(@Nullable MongoPersistentEntity<?> entity,
EntityProjection<?, ?> projection) {
Document getMappedFields(@Nullable MongoPersistentEntity<?> entity, Class<?> targetType,
ProjectionFactory projectionFactory) {
Document fields = evaluateFields(entity);
Document fields = new Document();
if (entity == null) {
return fields;
}
Document mappedFields;
if (!fields.isEmpty()) {
mappedFields = queryMapper.getMappedFields(fields, entity);
} else {
mappedFields = propertyOperations.computeMappedFieldsForProjection(projection, fields);
mappedFields = queryMapper.addMetaAttributes(mappedFields, entity);
}
if (entity.hasTextScoreProperty() && mappedFields.containsKey(entity.getTextScoreProperty().getFieldName())
&& !query.getQueryObject().containsKey("$text")) {
mappedFields.remove(entity.getTextScoreProperty().getFieldName());
}
if (mappedFields.isEmpty()) {
return BsonUtils.EMPTY_DOCUMENT;
}
return mappedFields;
}
private Document evaluateFields(@Nullable MongoPersistentEntity<?> entity) {
Document fields = query.getFieldsObject();
if (fields.isEmpty()) {
return BsonUtils.EMPTY_DOCUMENT;
}
Document evaluated = new Document();
for (Entry<String, Object> entry : fields.entrySet()) {
for (Entry<String, Object> entry : query.getFieldsObject().entrySet()) {
if (entry.getValue() instanceof MongoExpression) {
AggregationOperationContext ctx = entity == null ? Aggregation.DEFAULT_CONTEXT
: new RelaxedTypeBasedAggregationOperationContext(entity.getType(), mappingContext, queryMapper);
evaluated.put(entry.getKey(), AggregationExpression.from((MongoExpression) entry.getValue()).toDocument(ctx));
fields.put(entry.getKey(), AggregationExpression.from((MongoExpression) entry.getValue()).toDocument(ctx));
} else {
evaluated.put(entry.getKey(), entry.getValue());
fields.put(entry.getKey(), entry.getValue());
}
}
return evaluated;
Document mappedFields = fields;
if (entity == null) {
return mappedFields;
}
Document projectedFields = propertyOperations.computeFieldsForProjection(projectionFactory, fields,
entity.getType(), targetType);
if (ObjectUtils.nullSafeEquals(fields, projectedFields)) {
mappedFields = queryMapper.getMappedFields(projectedFields, entity);
} else {
mappedFields = queryMapper.getMappedFields(projectedFields,
mappingContext.getRequiredPersistentEntity(targetType));
}
if (entity.hasTextScoreProperty() && !query.getQueryObject().containsKey("$text")) {
mappedFields.remove(entity.getTextScoreProperty().getFieldName());
}
return mappedFields;
}
/**
@@ -402,8 +388,8 @@ class QueryOperations {
}
@Override
Document getMappedFields(@Nullable MongoPersistentEntity<?> entity,
EntityProjection<?, ?> projection) {
Document getMappedFields(@Nullable MongoPersistentEntity<?> entity, Class<?> targetType,
ProjectionFactory projectionFactory) {
return getMappedFields(entity);
}


@@ -46,6 +46,10 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation#aggregateAndReturn(java.lang.Class)
*/
@Override
public <T> ReactiveAggregation<T> aggregateAndReturn(Class<T> domainType) {
@@ -71,6 +75,10 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.AggregationOperationWithCollection#inCollection(java.lang.String)
*/
@Override
public AggregationOperationWithAggregation<T> inCollection(String collection) {
@@ -79,6 +87,10 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
return new ReactiveAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.AggregationOperationWithAggregation#by(org.springframework.data.mongodb.core.Aggregation)
*/
@Override
public TerminatingAggregationOperation<T> by(Aggregation aggregation) {
@@ -87,6 +99,10 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
return new ReactiveAggregationSupport<>(template, domainType, aggregation, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveAggregationOperation.TerminatingAggregationOperation#all()
*/
@Override
public Flux<T> all() {
return template.aggregate(aggregation, getCollectionName(aggregation), domainType);


@@ -46,6 +46,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation#changeStream(java.lang.Class)
*/
@Override
public <T> ReactiveChangeStream<T> changeStream(Class<T> domainType) {
@@ -72,6 +76,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithCollection#watchCollection(java.lang.String)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> watchCollection(String collection) {
@@ -80,6 +88,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return new ReactiveChangeStreamSupport<>(template, domainType, returnType, collection, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithCollection#watchCollection(java.lang.Class)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> watchCollection(Class<?> entityClass) {
@@ -88,6 +100,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return watchCollection(template.getCollectionName(entityClass));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#resumeAt(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> resumeAt(Object token) {
@@ -101,6 +117,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#resumeAfter(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> resumeAfter(Object token) {
@@ -109,6 +129,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return withOptions(builder -> builder.resumeAfter((BsonValue) token));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ResumingChangeStream#startAfter(java.lang.Object)
*/
@Override
public TerminatingChangeStream<T> startAfter(Object token) {
@@ -117,6 +141,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return withOptions(builder -> builder.startAfter((BsonValue) token));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithOptions#withOptions(java.util.function.Consumer)
*/
@Override
public ReactiveChangeStreamSupport<T> withOptions(Consumer<ChangeStreamOptionsBuilder> optionsConsumer) {
@@ -126,6 +154,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return new ReactiveChangeStreamSupport<>(template, domainType, returnType, collection, builder.build());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithProjection#as(java.lang.Class)
*/
@Override
public <R> ChangeStreamWithFilterAndProjection<R> as(Class<R> resultType) {
@@ -134,11 +166,19 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return new ReactiveChangeStreamSupport<>(template, domainType, resultType, collection, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithFilter#filter(org.springframework.data.mongodb.core.aggregation.Aggregation)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> filter(Aggregation filter) {
return withOptions(builder -> builder.filter(filter));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.ChangeStreamWithFilter#filter(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public ChangeStreamWithFilterAndProjection<T> filter(CriteriaDefinition by) {
@@ -148,6 +188,10 @@ class ReactiveChangeStreamOperationSupport implements ReactiveChangeStreamOperat
return filter(aggregation);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveChangeStreamOperation.TerminatingChangeStream#listen()
*/
@Override
public Flux<ChangeStreamEvent<T>> listen() {
return template.changeStream(collection, options != null ? options : ChangeStreamOptions.empty(), returnType);
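To show how the fluent change-stream API touched above is typically used, a minimal sketch; the `Person` class, the `people` collection name, and the age criterion are assumptions:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.ChangeStreamEvent;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Flux;

class ChangeStreamExample {

    static class Person { String id; String name; int age; }

    Flux<ChangeStreamEvent<Person>> adultChanges(ReactiveMongoOperations operations) {
        // change streams require a replica set or sharded cluster; nothing runs until subscription
        return operations.changeStream(Person.class)
                .watchCollection("people")
                .filter(where("age").gte(18))
                .listen();
    }
}
```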


@@ -44,6 +44,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation#query(java.lang.Class)
*/
@Override
public <T> ReactiveFind<T> query(Class<T> domainType) {
@@ -77,6 +81,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.query = query;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithCollection#inCollection(java.lang.String)
*/
@Override
public FindWithProjection<T> inCollection(String collection) {
@@ -85,6 +93,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithProjection#as(java.lang.Class)
*/
@Override
public <T1> FindWithQuery<T1> as(Class<T1> returnType) {
@@ -93,6 +105,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingFind<T> matching(Query query) {
@@ -101,6 +117,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new ReactiveFindSupport<>(template, domainType, returnType, collection, query);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#first()
*/
@Override
public Mono<T> first() {
@@ -110,6 +130,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return result.next();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#one()
*/
@Override
public Mono<T> one() {
@@ -131,31 +155,55 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#all()
*/
@Override
public Flux<T> all() {
return doFind(null);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#tail()
*/
@Override
public Flux<T> tail() {
return doFind(template.new TailingQueryFindPublisherPreparer(query, domainType));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery#near(org.springframework.data.mongodb.core.query.NearQuery)
*/
@Override
public TerminatingFindNear<T> near(NearQuery nearQuery) {
return () -> template.geoNear(nearQuery, domainType, getCollectionName(), returnType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#count()
*/
@Override
public Mono<Long> count() {
return template.count(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingFind#exists()
*/
@Override
public Mono<Boolean> exists() {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindDistinct#distinct(java.lang.String)
*/
@Override
public TerminatingDistinct<Object> distinct(String field) {
@@ -207,6 +255,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
@@ -215,6 +267,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new DistinctOperationSupport<>((ReactiveFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
@SuppressWarnings("unchecked")
public TerminatingDistinct<T> matching(Query query) {
@@ -224,6 +280,10 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return new DistinctOperationSupport<>((ReactiveFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingDistinct#all()
*/
@Override
public Flux<T> all() {
return delegate.doFindDistinct(field);
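The `DistinctOperationSupport` shown above backs the fluent distinct API. A hedged caller-side sketch (hypothetical `Person` class and `name` field):

```java
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Flux;

class DistinctExample {

    static class Person { String id; String name; }

    Flux<String> distinctNames(ReactiveMongoOperations operations) {
        return operations.query(Person.class)
                .distinct("name")
                .as(String.class)
                .all();
    }
}
```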


@@ -38,6 +38,10 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation#insert(java.lang.Class)
*/
@Override
public <T> ReactiveInsert<T> insert(Class<T> domainType) {
@@ -59,6 +63,10 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.TerminatingInsert#one(java.lang.Object)
*/
@Override
public Mono<T> one(T object) {
@@ -67,6 +75,10 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
return template.insert(object, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.TerminatingInsert#all(java.util.Collection)
*/
@Override
public Flux<T> all(Collection<? extends T> objects) {
@@ -75,6 +87,10 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
return template.insert(objects, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveInsertOperation.InsertWithCollection#inCollection(java.lang.String)
*/
@Override
public ReactiveInsert<T> inCollection(String collection) {


@@ -0,0 +1,31 @@
/*
* Copyright 2016-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.MongoClientSettings;
/**
* A factory bean for construction of a {@link MongoClientSettings} instance to be used with the async MongoDB driver.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
* @deprecated since 3.0 - Use {@link MongoClientSettingsFactoryBean} instead.
*/
@Deprecated
public class ReactiveMongoClientSettingsFactoryBean extends MongoClientSettingsFactoryBean {
}


@@ -42,6 +42,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
import org.springframework.transaction.reactive.TransactionalOperator;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -57,7 +58,8 @@ import com.mongodb.reactivestreams.client.MongoCollection;
* <p>
* Implemented by {@link ReactiveMongoTemplate}. Not often used but a useful option for extensibility and testability
* (as it can be easily mocked, stubbed, or be the target of a JDK proxy). Command execution using
* {@link ReactiveMongoOperations} is deferred until a subscriber subscribes to the {@link Publisher}. <br />
* {@link ReactiveMongoOperations} is deferred until a subscriber subscribes to the {@link Publisher}.
* <br />
* <strong>NOTE:</strong> Some operations cannot be executed within a MongoDB transaction. Please refer to the MongoDB
* specific documentation to learn more about <a href="https://docs.mongodb.com/manual/core/transactions/">Multi
* Document Transactions</a>.
@@ -89,7 +91,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
ReactiveIndexOperations indexOps(Class<?> entityClass);
/**
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* Execute the a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a Document. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
@@ -118,7 +120,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Document> executeCommand(Document command, @Nullable ReadPreference readPreference);
/**
* Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary. <br />
* Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary.
* <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance. Must not
@@ -129,7 +132,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> execute(ReactiveDatabaseCallback<T> action);
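A brief sketch of the callback and command execution described above; the listed command and return handling are illustrative only:

```java
import org.bson.Document;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class ExecuteExample {

    Flux<String> collectionNames(ReactiveMongoOperations operations) {
        // the callback receives the reactive driver's MongoDatabase; exceptions are translated
        return operations.execute(db -> db.listCollectionNames());
    }

    Mono<Document> connectionStatus(ReactiveMongoOperations operations) {
        return operations.executeCommand("{ \"connectionStatus\": 1 }");
    }
}
```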
/**
* Executes the given {@link ReactiveCollectionCallback} on the entity collection of the specified class. <br />
* Executes the given {@link ReactiveCollectionCallback} on the entity collection of the specified class.
* <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
@@ -140,7 +144,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> execute(Class<?> entityClass, ReactiveCollectionCallback<T> action);
/**
* Executes the given {@link ReactiveCollectionCallback} on the collection of the given name. <br />
* Executes the given {@link ReactiveCollectionCallback} on the collection of the given name.
* <br />
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param collectionName the name of the collection that specifies which {@link MongoCollection} instance will be
@@ -153,7 +158,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding the {@link ClientSession}
* provided by the given {@link Supplier} to each and every command issued against MongoDB. <br />
* provided by the given {@link Supplier} to each and every command issued against MongoDB.
* <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use
* {@link ReactiveSessionScoped#execute(ReactiveSessionCallback, Consumer)} to provide a hook for processing the
* {@link ClientSession} when done.
@@ -171,7 +177,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding a new {@link ClientSession}
* with given {@literal sessionOptions} to each and every command issued against MongoDB. <br />
* with given {@literal sessionOptions} to each and every command issued against MongoDB.
* <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use
* {@link ReactiveSessionScoped#execute(ReactiveSessionCallback, Consumer)} to provide a hook for processing the
* {@link ClientSession} when done.
@@ -197,14 +204,48 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
ReactiveSessionScoped withSession(Publisher<ClientSession> sessionProvider);
/**
* Obtain a {@link ClientSession} bound instance of {@link ReactiveMongoOperations}. <br />
* Obtain a {@link ClientSession} bound instance of {@link ReactiveMongoOperations}.
* <br />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle.
*
* @param session must not be {@literal null}.
* @return {@link ClientSession} bound instance of {@link ReactiveMongoOperations}.
* @since 2.1
*/
ReactiveMongoOperations withSession(ClientSession session);
/**
* Initiate a new {@link ClientSession} and obtain a {@link ClientSession session} bound instance of
* {@link ReactiveSessionScoped}. Starts the transaction and adds the {@link ClientSession} to each and every command
* issued against MongoDB.
* <br />
* Each {@link ReactiveSessionScoped#execute(ReactiveSessionCallback) execution} initiates a new managed transaction
* that is {@link ClientSession#commitTransaction() committed} on success. Transactions are
* {@link ClientSession#abortTransaction() rolled back} upon errors.
*
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @deprecated since 2.2. Use {@code @Transactional} or {@link TransactionalOperator}.
*/
@Deprecated
ReactiveSessionScoped inTransaction();
/**
* Obtain a {@link ClientSession session} bound instance of {@link ReactiveSessionScoped}, start the transaction and
* bind the {@link ClientSession} provided by the given {@link Publisher} to each and every command issued against
* MongoDB.
* <br />
* Each {@link ReactiveSessionScoped#execute(ReactiveSessionCallback) execution} initiates a new managed transaction
* that is {@link ClientSession#commitTransaction() committed} on success. Transactions are
* {@link ClientSession#abortTransaction() rolled back} upon errors.
*
* @param sessionProvider must not be {@literal null}.
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @since 2.1
* @deprecated since 2.2. Use {@code @Transactional} or {@link TransactionalOperator}.
*/
@Deprecated
ReactiveSessionScoped inTransaction(Publisher<ClientSession> sessionProvider);
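Since both `inTransaction` variants are deprecated in favour of `TransactionalOperator`, a hedged sketch of the replacement style; the `Person` class and the injected `ReactiveMongoDatabaseFactory` are assumptions:

```java
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.ReactiveMongoTransactionManager;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.transaction.reactive.TransactionalOperator;
import reactor.core.publisher.Mono;

class TransactionExample {

    static class Person {
        String id;
        String name;
        Person(String name) { this.name = name; }
    }

    Mono<Person> saveBothAtomically(ReactiveMongoOperations operations, ReactiveMongoDatabaseFactory factory) {
        // MongoDB transactions require a replica set or sharded cluster
        TransactionalOperator transactional =
                TransactionalOperator.create(new ReactiveMongoTransactionManager(factory));

        return operations.insert(new Person("Ada"))
                .then(operations.insert(new Person("Grace")))
                .as(transactional::transactional);
    }
}
```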
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -251,7 +292,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Get a {@link MongoCollection} by name. The returned collection may not exist yet (except in local memory) and is
* created on first interaction with the server. Collections can be explicitly created via
* {@link #createCollection(Class)}. Please make sure to check if the collection {@link #collectionExists(Class)
* exists} first. <br />
* exists} first.
* <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection.
@@ -260,7 +302,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<MongoCollection<Document>> getCollection(String collectionName);
/**
* Check to see if a collection with a name indicated by the entity class exists. <br />
* Check to see if a collection with a name indicated by the entity class exists.
* <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the name of the collection. Must not be {@literal null}.
@@ -269,7 +312,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Mono<Boolean> collectionExists(Class<T> entityClass);
/**
* Check to see if a collection with a given name exists. <br />
* Check to see if a collection with a given name exists.
* <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection. Must not be {@literal null}.
@@ -278,7 +322,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Boolean> collectionExists(String collectionName);
/**
* Drop the collection with the name indicated by the entity class. <br />
* Drop the collection with the name indicated by the entity class.
* <br />
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the collection to drop/delete. Must not be {@literal null}.
@@ -286,7 +331,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Mono<Void> dropCollection(Class<T> entityClass);
/**
* Drop the collection with the given name. <br />
* Drop the collection with the given name.
* <br />
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection to drop/delete.
@@ -294,9 +340,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
Mono<Void> dropCollection(String collectionName);
/**
* Query for a {@link Flux} of objects of type T from the collection used by the entity class. <br />
* Query for a {@link Flux} of objects of type T from the collection used by the entity class.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -306,9 +354,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> findAll(Class<T> entityClass);
/**
* Query for a {@link Flux} of objects of type T from the specified collection. <br />
* Query for a {@link Flux} of objects of type T from the specified collection.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
@@ -320,9 +370,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity class to a single instance of an object of the
* specified type. <br />
* specified type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -335,9 +387,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object of the specified
* type. <br />
* type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -383,7 +437,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a {@link Flux} of the specified type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -395,9 +450,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<T> find(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the specified collection to a {@link Flux} of the specified type. <br />
* Map the results of an ad-hoc query on the specified collection to a {@link Flux} of the specified type.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -508,9 +565,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation. <br />
* Execute an aggregation operation.
* <br />
* The raw results will be mapped to the given entity class and are returned as stream. The name of the
* inputCollection is derived from the {@link TypedAggregation#getInputType() aggregation input type}. <br />
* inputCollection is derived from the {@link TypedAggregation#getInputType() aggregation input type}.
* <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting batch size cause
* {@link IllegalArgumentException}.
@@ -524,9 +583,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation. <br />
* Execute an aggregation operation.
* <br />
* The raw results will be mapped to the given {@code outputType}. The name of the inputCollection is derived from the
* {@code inputType}. <br />
* {@code inputType}.
* <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting batch size cause
* {@link IllegalArgumentException}.
@@ -542,8 +603,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<O> Flux<O> aggregate(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
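A minimal sketch of this aggregate variant, assuming a hypothetical `Person` input type with a `city` field and raw `Document` output:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.group;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.match;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.bson.Document;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import reactor.core.publisher.Flux;

class AggregateExample {

    static class Person { String id; String city; int age; }

    Flux<Document> adultsPerCity(ReactiveMongoOperations operations) {
        Aggregation aggregation = newAggregation(
                match(where("age").gte(18)),
                group("city").count().as("total"));

        // input collection derived from Person, raw results mapped to Document
        return operations.aggregate(aggregation, Person.class, Document.class);
    }
}
```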
/**
* Execute an aggregation operation. <br />
* The raw results will be mapped to the given entity class. <br />
* Execute an aggregation operation.
* <br />
* The raw results will be mapped to the given entity class.
* <br />
* Aggregation streaming cannot be used with {@link AggregationOptions#isExplain() aggregation explain} nor with
* {@link AggregationOptions#getCursorBatchSize()}. Enabling explanation mode or setting a batch size causes an
* {@link IllegalArgumentException}.
@@ -693,6 +756,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* fields specification. Must not be {@literal null}.
* @param replacement the replacement document. Must not be {@literal null}.
* @return the converted object that was updated or {@link Mono#empty()}, if not found.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 2.1
*/
default <T> Mono<T> findAndReplace(Query query, T replacement) {
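A short sketch of this single-argument variant, again using the hypothetical Person document; the returned Mono completes empty when nothing matched:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class FindAndReplaceExample {

    Mono<Person> replaceByLastname(ReactiveMongoOperations operations, String lastname, Person replacement) {
        Query query = new Query(Criteria.where("lastname").is(lastname));
        // Emits the document as it was before the replacement by default.
        return operations.findAndReplace(query, replacement);
    }
}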
@@ -732,6 +797,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @return the converted object that was updated or {@link Mono#empty()}, if not found. Depending on the value of
* {@link FindAndReplaceOptions#isReturnNew()} this will either be the object as it was before the update or
* as it is after the update.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 2.1
*/
default <T> Mono<T> findAndReplace(Query query, T replacement, FindAndReplaceOptions options) {
@@ -802,6 +869,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @return the converted object that was updated or {@link Mono#empty()}, if not found. Depending on the value of
* {@link FindAndReplaceOptions#isReturnNew()} this will either be the object as it was before the update or
* as it is after the update.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given replacement value.
* @since 2.1
*/
default <S, T> Mono<T> findAndReplace(Query query, S replacement, FindAndReplaceOptions options, Class<S> entityType,
@@ -838,8 +907,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Map the results of an ad-hoc query on the collection for the entity type to a single instance of an object of the
* specified type. The first document that matches the query is returned and also removed from the collection in the
* database. <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -855,7 +926,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -873,96 +945,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @see #exactCount(Query, Class)
* @see #estimatedCount(Class)
*/
Mono<Long> count(Query query, Class<?> entityClass);
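A minimal sketch of counting with a mapped query, assuming the hypothetical Person document:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class CountExample {

    Mono<Long> countAdults(ReactiveMongoOperations operations) {
        Query query = new Query(Criteria.where("age").gte(21));
        // For an empty query the template may fall back to estimatedCount, as noted above.
        return operations.count(query, Person.class);
    }
}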
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @see #estimatedCount(String)
* @see #exactCount(Query, String)
*/
Mono<Long> count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method may choose to use {@link #estimatedCount(Class)} for empty queries instead of running an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} which may have an impact on performance.
*
* @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* {@literal null}.
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #estimatedCount(String)
* @see #exactCount(Query, Class, String)
*/
Mono<Long> count(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations, e.g. on sharded clusters or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
* @since 3.1
*/
default Mono<Long> estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return estimatedCount(getCollectionName(entityClass));
}
/**
* Estimate the number of documents in the given collection based on collection statistics. <br />
* Please make sure to read the MongoDB reference documentation about limitations, e.g. on sharded clusters or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
* @since 3.1
*/
Mono<Long> estimatedCount(String collectionName);
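A short sketch of the statistics-based count; the "people" collection name is the assumption carried over from the earlier Person sketch:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Mono;

class EstimatedCountExample {

    Mono<Long> estimate(ReactiveMongoOperations operations) {
        // Fast because it reads collection statistics, but not guaranteed to be exact,
        // e.g. on sharded clusters or inside transactions.
        return operations.estimatedCount("people");
    }
}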
/**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
* count all matches.
* <br />
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
@@ -973,11 +957,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
* @since 3.4
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
*/
default Mono<Long> exactCount(Query query, Class<?> entityClass) {
return exactCount(query, entityClass, getCollectionName(entityClass));
}
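For contrast with count and estimatedCount, a hedged sketch of the exact variant introduced in 3.4:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class ExactCountExample {

    Mono<Long> exactTotal(ReactiveMongoOperations operations) {
        // Always runs countDocuments, even for an empty query, trading speed for accuracy.
        return operations.exactCount(new Query(), Person.class);
    }
}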
Mono<Long> count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
@@ -986,7 +969,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* count all matches.
* <br />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
@@ -997,11 +981,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @see #count(Query, Class, String)
* @since 3.4
*/
default Mono<Long> exactCount(Query query, String collectionName) {
return exactCount(query, null, collectionName);
}
Mono<Long> count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
@@ -1009,7 +990,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Query {@link Query#getSkip() offset} and {@link Query#getLimit() limit} can have direct
* influence on the resulting number of documents found as those values are passed on to the server and potentially
* limit the range and order within which the server performs the count operation. Use an {@literal unpaged} query to
* count all matches. <br />
* count all matches.
* <br />
* This method uses an
* {@link com.mongodb.reactivestreams.client.MongoCollection#countDocuments(org.bson.conversions.Bson, com.mongodb.client.model.CountOptions)
* aggregation execution} even for empty {@link Query queries} which may have an impact on performance, but guarantees
@@ -1021,18 +1003,51 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return the count of matching documents.
* @since 3.4
*/
Mono<Long> exactCount(Query query, @Nullable Class<?> entityClass, String collectionName);
Mono<Long> count(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
* <br />
* Please make sure to read the MongoDB reference documentation about limitations, e.g. on sharded clusters or inside
* transactions.
*
* @param entityClass must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @since 3.1
*/
default Mono<Long> estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return estimatedCount(getCollectionName(entityClass));
}
/**
* Estimate the number of documents in the given collection based on collection statistics.
* <br />
* Please make sure to read the MongoDB reference documentation about limitations, e.g. on sharded clusters or inside
* transactions.
*
* @param collectionName must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
* @since 3.1
*/
Mono<Long> estimatedCount(String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion"</a> for more details. <br />
* Type Conversion"</a> for more details.
* <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1040,13 +1055,17 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return the inserted object.
* @throws IllegalArgumentException in case the {@code objectToSave} is collection-like.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given object type.
*/
<T> Mono<T> insert(T objectToSave);
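A minimal sketch of a single insert, using the hypothetical Person document:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Mono;

class InsertExample {

    Mono<Person> insertPerson(ReactiveMongoOperations operations, Person person) {
        // The emitted object carries the generated id; nothing happens before subscription.
        return operations.insert(person);
    }
}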
/**
* Insert the object into the specified collection. <br />
* Insert the object into the specified collection.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
* <br />
* The {@code objectToSave} must not be collection-like.
@@ -1063,7 +1082,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*
* @param batchToSave the batch of objects to save. Must not be {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the inserted objects .
* @return the inserted objects.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
*/
<T> Flux<T> insert(Collection<? extends T> batchToSave, Class<?> entityClass);
@@ -1082,17 +1103,22 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*
* @param objectsToSave the list of objects to save. Must not be {@literal null}.
* @return the saved objects.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} for the given objects.
*/
<T> Flux<T> insertAll(Collection<? extends T> objectsToSave);
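A short sketch of the batch variant; as described above, the collection is derived per element:

import java.util.List;

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Flux;

class InsertAllExample {

    Flux<Person> insertBatch(ReactiveMongoOperations operations, List<Person> people) {
        // Each element may target a different collection depending on its type.
        return operations.insertAll(people);
    }
}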
/**
* Insert the object into the collection for the entity type of the object to save. <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. <br />
* Insert the object into the collection for the entity type of the object to save.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}.
* <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion"</a> for more details. <br />
* Type Conversion"</a> for more details.
* <br />
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
@@ -1106,6 +1132,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param batchToSave the publisher which provides objects to save. Must not be {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the inserted objects.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} for the type.
*/
<T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, Class<?> entityClass);
@@ -1129,32 +1157,38 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'. <br />
* object is not already present, that is an 'upsert'.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's
* Type Conversion"</a> for more details. <br />
* Type Conversion"</a> for more details.
* <br />
* The {@code objectToSave} must not be collection-like.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return the saved object.
* @throws IllegalArgumentException in case the {@code objectToSave} is collection-like.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given object type.
*/
<T> Mono<T> save(T objectToSave);
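A minimal sketch of the upsert behaviour, again with the hypothetical Person document:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Mono;

class SaveExample {

    Mono<Person> upsert(ReactiveMongoOperations operations, Person person) {
        // Inserts when the document is new, otherwise replaces the existing one ('upsert').
        return operations.save(person);
    }
}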
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'. <br />
* is an 'upsert'.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.
@@ -1165,30 +1199,34 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'. <br />
* object is not already present, that is an 'upsert'.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation"> Spring's Type
* Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation"> Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return the saved object.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given object type.
*/
<T> Mono<T> save(Mono<? extends T> objectToSave);
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'. <br />
* is an 'upsert'.
* <br />
* The object is converted to the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* If your object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API.
* See <a href="https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.
@@ -1208,6 +1246,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* the existing object. Must not be {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @since 3.0
* @see Update
* @see AggregationUpdate
@@ -1262,6 +1302,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* the existing. Must not be {@literal null}.
* @param entityClass class that determines the collection to use.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @since 3.0
* @see Update
* @see AggregationUpdate
@@ -1317,6 +1359,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param entityClass class of the pojo to be operated on. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
* @since 3.0
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
* @see Update
* @see AggregationUpdate
*/
@@ -1363,6 +1407,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*
* @param object must not be {@literal null}.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given object type.
*/
Mono<DeleteResult> remove(Object object);
@@ -1380,6 +1426,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*
* @param objectToRemove must not be {@literal null}.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given object type.
*/
Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove);
@@ -1399,6 +1447,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param query the query document that specifies the criteria used to remove a record.
* @param entityClass class that determines the collection to use.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
*/
Mono<DeleteResult> remove(Query query, Class<?> entityClass);
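A hedged sketch of a criteria-based delete, mapping the result to the number of removed documents:

import com.mongodb.client.result.DeleteResult;

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Mono;

class RemoveExample {

    Mono<Long> removeMinors(ReactiveMongoOperations operations) {
        Query query = new Query(Criteria.where("age").lt(18));
        return operations.remove(query, Person.class)
                .map(DeleteResult::getDeletedCount); // how many documents were removed
    }
}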
@@ -1442,6 +1492,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param query the query document that specifies the criteria used to find and remove documents.
* @param entityClass class of the pojo to be operated on.
* @return the {@link Flux} converted objects deleted by this operation.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
*/
<T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass);
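Where the removed documents are still needed downstream, a sketch of the find-and-remove variant:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Flux;

class FindAllAndRemoveExample {

    Flux<Person> removeAndReturn(ReactiveMongoOperations operations, String lastname) {
        Query query = new Query(Criteria.where("lastname").is(lastname));
        // Emits every removed document, so it can still be archived or logged.
        return operations.findAllAndRemove(query, Person.class);
    }
}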
@@ -1461,9 +1513,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
* stream. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}. <br />
* {@link Subscription#cancel() canceled}.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1471,6 +1525,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* specification.
* @param entityClass the parametrized type of the returned {@link Flux}.
* @return the {@link Flux} of converted objects.
* @throws org.springframework.data.mapping.MappingException if the target collection name cannot be
* {@link #getCollectionName(Class) derived} from the given type.
*/
<T> Flux<T> tail(Query query, Class<T> entityClass);
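A hedged sketch of a tailable-cursor query; the LogEntry document and its capped "logs" collection are assumptions made for the example:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.core.publisher.Flux;

class TailExample {

    // Hypothetical document stored in a capped collection; tailable cursors require one.
    @Document("logs")
    static class LogEntry {
        String id;
        String level;
        String message;
    }

    Flux<LogEntry> tailErrors(ReactiveMongoOperations operations) {
        // The Flux stays open until the subscription is cancelled.
        return operations.tail(new Query(Criteria.where("level").is("ERROR")), LogEntry.class);
    }
}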
@@ -1478,9 +1534,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
* stream. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}. <br />
* {@link Subscription#cancel() canceled}.
* <br />
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used. <br />
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <br />
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
@@ -1496,9 +1554,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> for all events in
* the configured default database via the reactive infrastructure. Use the optional provided {@link Aggregation} to
* filter events. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}. <br />
* {@link Subscription#cancel() canceled}.
* <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumeToken}
* for resuming change streams.
*
@@ -1518,9 +1578,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> for all events in
* the given collection via the reactive infrastructure. Use the optional provided {@link Aggregation} to filter
* events. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}. <br />
* {@link Subscription#cancel() canceled}.
* <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumeToken}
* for resuming change streams.
*
@@ -1541,9 +1603,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Subscribe to a MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Stream</a> via the reactive
* infrastructure. Use the optional provided {@link Aggregation} to filter events. The stream will not be completed
* unless the {@link org.reactivestreams.Subscription} is {@link Subscription#cancel() canceled}. <br />
* unless the {@link org.reactivestreams.Subscription} is {@link Subscription#cancel() canceled}.
* <br />
* The {@link ChangeStreamEvent#getBody()} is mapped to the {@literal resultType} while the
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload. <br />
* {@link ChangeStreamEvent#getRaw()} contains the unmodified payload.
* <br />
* Use {@link ChangeStreamOptions} to set arguments like {@link ChangeStreamOptions#getResumeToken() the resumeToken}
* for resuming change streams.
*
@@ -1573,9 +1637,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param options additional options like output collection. Must not be {@literal null}.
* @return a {@link Flux} emitting the result document sequence. Never {@literal null}.
* @since 2.1
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, Class<T> resultType, String mapFunction,
String reduceFunction, MapReduceOptions options);
@@ -1593,9 +1655,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param options additional options like output collection. Must not be {@literal null}.
* @return a {@link Flux} emitting the result document sequence. Never {@literal null}.
* @since 2.1
* @deprecated since 3.4 in favor of {@link #aggregate(TypedAggregation, Class)}.
*/
@Deprecated
<T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, String inputCollectionName, Class<T> resultType,
String mapFunction, String reduceFunction, MapReduceOptions options);
@@ -1611,6 +1671,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*
* @param entityClass must not be {@literal null}.
* @return never {@literal null}.
* @throws org.springframework.data.mapping.MappingException if the collection name cannot be derived from the type.
* @since 2.1
*/
String getCollectionName(Class<?> entityClass);
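A one-line sketch of resolving the mapped collection name, reusing the hypothetical Person document:

import org.springframework.data.mongodb.core.ReactiveMongoOperations;

class CollectionNameExample {

    String resolve(ReactiveMongoOperations operations) {
        // "people" for the @Document("people") annotated Person class; a MappingException
        // is raised for types the mapping layer does not treat as entities.
        return operations.getCollectionName(Person.class);
    }
}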

View File

@@ -41,6 +41,10 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
this.tempate = tempate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation#remove(java.lang.Class)
*/
@Override
public <T> ReactiveRemove<T> remove(Class<T> domainType) {
@@ -64,6 +68,10 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
this.collection = collection;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.RemoveWithCollection#inCollection(String)
*/
@Override
public RemoveWithQuery<T> inCollection(String collection) {
@@ -72,6 +80,10 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return new ReactiveRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.RemoveWithQuery#matching(org.springframework.data.mongodb.core.Query)
*/
@Override
public TerminatingRemove<T> matching(Query query) {
@@ -80,6 +92,10 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return new ReactiveRemoveSupport<>(template, domainType, query, collection);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.TerminatingRemove#all()
*/
@Override
public Mono<DeleteResult> all() {
@@ -88,6 +104,10 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return template.doRemove(collectionName, query, domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveRemoveOperation.TerminatingRemove#findAndRemove()
*/
@Override
public Flux<T> findAndRemove() {

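The class above backs the fluent ReactiveRemoveOperation API. A hedged sketch of how the chain is typically composed, reusing the hypothetical Person document; the explicit collection name is optional:

import com.mongodb.client.result.DeleteResult;

import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import reactor.core.publisher.Mono;

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

class FluentRemoveExample {

    Mono<DeleteResult> removeByLastname(ReactiveMongoTemplate template, String lastname) {
        return template.remove(Person.class)                     // entry point
                .inCollection("people")                          // optional, defaults to the mapped collection
                .matching(query(where("lastname").is(lastname)))
                .all();                                          // or findAndRemove() to get the documents back
    }
}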
View File

@@ -42,6 +42,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
this.template = template;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation#update(java.lang.Class)
*/
@Override
public <T> ReactiveUpdate<T> update(Class<T> domainType) {
@@ -79,6 +83,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
this.targetType = targetType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithUpdate#apply(org.springframework.data.mongodb.core.query.UpdateDefinition)
*/
@Override
public TerminatingUpdate<T> apply(org.springframework.data.mongodb.core.query.UpdateDefinition update) {
@@ -88,6 +96,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithCollection#inCollection(java.lang.String)
*/
@Override
public UpdateWithQuery<T> inCollection(String collection) {
@@ -97,16 +109,28 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#first()
*/
@Override
public Mono<UpdateResult> first() {
return doUpdate(false, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#upsert()
*/
@Override
public Mono<UpdateResult> upsert() {
return doUpdate(true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingFindAndModify#findAndModify()
*/
@Override
public Mono<T> findAndModify() {
@@ -117,6 +141,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
collectionName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingFindAndReplace#findAndReplace()
*/
@Override
public Mono<T> findAndReplace() {
return template.findAndReplace(query, replacement,
@@ -124,6 +152,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
getCollectionName(), targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithQuery#matching(org.springframework.data.mongodb.core.Query)
*/
@Override
public UpdateWithUpdate<T> matching(Query query) {
@@ -133,11 +165,19 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.TerminatingUpdate#all()
*/
@Override
public Mono<UpdateResult> all() {
return doUpdate(true, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndModifyWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndModifyOptions)
*/
@Override
public TerminatingFindAndModify<T> withOptions(FindAndModifyOptions options) {
@@ -147,6 +187,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.UpdateWithUpdate#replaceWith(java.lang.Object)
*/
@Override
public FindAndReplaceWithProjection<T> replaceWith(T replacement) {
@@ -156,6 +200,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
findAndReplaceOptions, replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithOptions#withOptions(org.springframework.data.mongodb.core.FindAndReplaceOptions)
*/
@Override
public FindAndReplaceWithProjection<T> withOptions(FindAndReplaceOptions options) {
@@ -165,6 +213,10 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
replacement, targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveUpdateOperation.FindAndReplaceWithProjection#as(java.lang.Class)
*/
@Override
public <R> FindAndReplaceWithOptions<R> as(Class<R> resultType) {

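Analogously, the class above backs the fluent ReactiveUpdateOperation API. A sketch of a typical chain; the "status" field is a made-up example:

import com.mongodb.client.result.UpdateResult;

import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.query.Update;
import reactor.core.publisher.Mono;

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

class FluentUpdateExample {

    Mono<UpdateResult> promoteAdults(ReactiveMongoTemplate template) {
        return template.update(Person.class)
                .matching(query(where("age").gte(21)))
                .apply(new Update().set("status", "ADULT")) // hypothetical field
                .all();                                     // first() and upsert() are the other terminals
    }
}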
View File

@@ -75,16 +75,28 @@ public class SimpleMongoClientDatabaseFactory extends MongoDatabaseFactorySuppor
super(mongoClient, databaseName, mongoInstanceCreated, new MongoExceptionTranslator());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return getMongoClient().startSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoDbFactoryBase#closeClient()
*/
@Override
protected void closeClient() {
getMongoClient().close();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoDbFactoryBase#doGetMongoDatabase(java.lang.String)
*/
@Override
protected MongoDatabase doGetMongoDatabase(String dbName) {
return getMongoClient().getDatabase(dbName);

View File

@@ -0,0 +1,74 @@
/*
* Copyright 2018-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.ConnectionString;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;
/**
* Factory to create {@link MongoDatabase} instances from a {@link MongoClient} instance.
*
* @author Christoph Strobl
* @since 2.1
* @deprecated since 3.0, use {@link SimpleMongoClientDatabaseFactory} instead.
*/
@Deprecated
public class SimpleMongoClientDbFactory extends SimpleMongoClientDatabaseFactory {
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance for the given {@code connectionString}.
*
* @param connectionString connection coordinates for a database connection. Must contain a database name and must not
* be {@literal null} or empty.
* @see <a href="https://docs.mongodb.com/manual/reference/connection-string/">MongoDB Connection String reference</a>
*/
public SimpleMongoClientDbFactory(String connectionString) {
this(new ConnectionString(connectionString));
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance for the given {@link ConnectionString}.
*
* @param connectionString connection coordinates for a database connection. Must also contain a database name and must
* not be {@literal null}.
*/
public SimpleMongoClientDbFactory(ConnectionString connectionString) {
this(MongoClients.create(connectionString), connectionString.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
*/
public SimpleMongoClientDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param mongoInstanceCreated
*/
private SimpleMongoClientDbFactory(MongoClient mongoClient, String databaseName, boolean mongoInstanceCreated) {
super(mongoClient, databaseName, mongoInstanceCreated);
}
}

View File

@@ -97,10 +97,18 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
this.writeConcern = writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#getMongoDatabase()
*/
public Mono<MongoDatabase> getMongoDatabase() throws DataAccessException {
return getMongoDatabase(databaseName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#getMongoDatabase(java.lang.String)
*/
public Mono<MongoDatabase> getMongoDatabase(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");
@@ -125,20 +133,36 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#getExceptionTranslator()
*/
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getCodecRegistry()
*/
@Override
public CodecRegistry getCodecRegistry() {
return this.mongo.getDatabase(databaseName).getCodecRegistry();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public Mono<ClientSession> getSession(ClientSessionOptions options) {
return Mono.from(mongo.startSession(options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDbFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public ReactiveMongoDatabaseFactory withSession(ClientSession session) {
return new ClientSessionBoundMongoDbFactory(session, this);
@@ -162,36 +186,64 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getMongoDatabase()
*/
@Override
public Mono<MongoDatabase> getMongoDatabase() throws DataAccessException {
return delegate.getMongoDatabase().map(this::decorateDatabase);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getMongoDatabase(java.lang.String)
*/
@Override
public Mono<MongoDatabase> getMongoDatabase(String dbName) throws DataAccessException {
return delegate.getMongoDatabase(dbName).map(this::decorateDatabase);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return delegate.getExceptionTranslator();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getCodecRegistry()
*/
@Override
public CodecRegistry getCodecRegistry() {
return delegate.getCodecRegistry();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public Mono<ClientSession> getSession(ClientSessionOptions options) {
return delegate.getSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public ReactiveMongoDatabaseFactory withSession(ClientSession session) {
return delegate.withSession(session);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.ReactiveMongoDatabaseFactory#isTransactionActive()
*/
@Override
public boolean isTransactionActive() {
return session != null && session.hasActiveTransaction();

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2022 the original author or authors.
* Copyright 2016-2022. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -44,6 +44,9 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
this.value = value;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return toDocument(this.value, context);

View File

@@ -334,6 +334,9 @@ public class AccumulatorOperators {
return new Sum(append(value));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -414,6 +417,9 @@ public class AccumulatorOperators {
return new Avg(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -494,6 +500,9 @@ public class AccumulatorOperators {
return new Max(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -574,6 +583,9 @@ public class AccumulatorOperators {
return new Min(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -654,6 +666,9 @@ public class AccumulatorOperators {
return new StdDevPop(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
@@ -734,6 +749,9 @@ public class AccumulatorOperators {
return new StdDevSamp(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDocument(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {

View File

@@ -99,6 +99,10 @@ public class AddFieldsOperation extends DocumentEnhancingOperation {
return new AddFieldsOperationBuilder(getValueMap());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.DocumentEnhancingOperation#mongoOperator()
*/
@Override
protected String mongoOperator() {
return "$addFields";

View File

@@ -226,7 +226,8 @@ public class Aggregation {
}
/**
* Obtain an {@link AddFieldsOperationBuilder builder} instance to create a new {@link AddFieldsOperation}. <br />
* Obtain an {@link AddFieldsOperationBuilder builder} instance to create a new {@link AddFieldsOperation}.
* <br />
* Starting in version 4.2, MongoDB adds a new aggregation pipeline stage {@link AggregationUpdate#set $set} that is
* an alias for {@code $addFields}.
*
@@ -434,6 +435,18 @@ public class Aggregation {
return new SortByCountOperation(groupAndSortExpression);
}
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param elementsToSkip must not be less than zero.
* @return new instance of {@link SkipOperation}.
* @deprecated scheduled to be removed in favor of {@link #skip(long)}.
*/
@Deprecated
public static SkipOperation skip(int elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
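Since the int overload is deprecated in favor of skip(long), a brief sketch of a paging pipeline built with the long-based operators; the field names are assumptions:

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.aggregation.Aggregation;

import static org.springframework.data.mongodb.core.aggregation.Aggregation.limit;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.match;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.skip;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.sort;
import static org.springframework.data.mongodb.core.query.Criteria.where;

class SkipExample {

    Aggregation pageOfAdults(long page, long pageSize) {
        return newAggregation(
                match(where("age").gte(21)),
                sort(Sort.Direction.ASC, "lastname"),
                skip(page * pageSize), // resolves to the long overload
                limit(pageSize));
    }
}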
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
@@ -712,7 +725,8 @@ public class Aggregation {
}
/**
* Converts this {@link Aggregation} specification to a {@link Document}. <br />
* Converts this {@link Aggregation} specification to a {@link Document}.
* <br />
* MongoDB requires as of 3.6 cursor-based aggregation. Use {@link #toPipeline(AggregationOperationContext)} to render
* an aggregation pipeline.
*
@@ -727,6 +741,10 @@ public class Aggregation {
return options.applyAndReturnPotentiallyChangedCommand(command);
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return SerializationUtils.serializeToJsonSafely(toDocument("__collection__", DEFAULT_CONTEXT));
@@ -770,6 +788,10 @@ public class Aggregation {
return false;
}
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/
@Override
public String toString() {
return PREFIX.concat(name());

View File

@@ -0,0 +1,108 @@
/*
* Copyright 2015-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.bson.Document;
import org.springframework.util.Assert;
/**
* An enum of supported {@link AggregationExpression}s in aggregation pipeline stages.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.7
* @deprecated since 1.10. Please use {@link ArithmeticOperators} and {@link ComparisonOperators} instead.
*/
@Deprecated
public enum AggregationFunctionExpressions {
SIZE, CMP, EQ, GT, GTE, LT, LTE, NE, SUBTRACT, ADD, MULTIPLY;
/**
* Returns an {@link AggregationExpression} built from the current {@link Enum} name and the given parameters.
*
* @param parameters must not be {@literal null}
* @return new instance of {@link AggregationExpression}.
*/
public AggregationExpression of(Object... parameters) {
Assert.notNull(parameters, "Parameters must not be null!");
return new FunctionExpression(name().toLowerCase(), parameters);
}
/**
* An {@link AggregationExpression} representing a function call.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.7
*/
static class FunctionExpression implements AggregationExpression {
private final String name;
private final List<Object> values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
*
* @param name must not be {@literal null} or empty.
* @param values must not be {@literal null}.
*/
public FunctionExpression(String name, Object[] values) {
Assert.hasText(name, "Name must not be null!");
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = Arrays.asList(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Expression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.size());
for (Object value : values) {
args.add(unpack(value, context));
}
return new Document("$" + name, args);
}
private static Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDocument(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
return value;
}
}
}

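A short sketch of the deprecated shortcut added above next to its suggested replacement; the field name and operand are illustrative:

import org.springframework.data.mongodb.core.aggregation.AggregationExpression;
import org.springframework.data.mongodb.core.aggregation.AggregationFunctionExpressions;
import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators;
import org.springframework.data.mongodb.core.aggregation.Fields;

// FunctionExpression lower-cases the enum name and prefixes "$", yielding { $add : [ "$price", 10 ] }.
AggregationExpression legacy = AggregationFunctionExpressions.ADD.of(Fields.field("price"), 10);

// Recommended replacement since 1.10.
AggregationExpression replacement = ArithmeticOperators.valueOf("price").add(10);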
View File

@@ -80,16 +80,28 @@ class AggregationOperationRenderer {
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(org.bson.Document, java.lang.Class)
*/
@Override
public Document getMappedObject(Document document, @Nullable Class<?> type) {
return document;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new DirectFieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new DirectFieldReference(new ExposedField(new AggregationField(name), true));

View File

@@ -339,6 +339,9 @@ public class AggregationOptions {
return !maxTime.isZero() && !maxTime.isNegative();
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return toDocument().toJson();

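Since toString() above delegates to toDocument().toJson(), a small sketch of building the options it renders; the builder flag shown is illustrative:

import org.springframework.data.mongodb.core.aggregation.AggregationOptions;

// Rendered by toString() as JSON, e.g. something along the lines of { "allowDiskUse" : true, ... }.
AggregationOptions options = AggregationOptions.builder()
		.allowDiskUse(true)
		.build();

String json = options.toString();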
View File

@@ -77,6 +77,10 @@ public class AggregationResults<T> implements Iterable<T> {
return mappedResults.size() == 1 ? mappedResults.get(0) : null;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<T> iterator() {
return mappedResults.iterator();
}

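The iterator() above simply exposes the mapped results; a consumption sketch, assuming an existing MongoTemplate and Aggregation (the collection name is illustrative):

import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

AggregationResults<Document> results = mongoTemplate.aggregate(aggregation, "orders", Document.class);

// Iterates the mapped results via the iterator() shown above.
for (Document row : results) {
	// process each result document
}

// null unless the pipeline produced exactly one mapped result.
Document unique = results.getUniqueMappedResult();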
View File

@@ -64,6 +64,9 @@ public class AggregationSpELExpression implements AggregationExpression {
return new AggregationSpELExpression(expressionString, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return (Document) TRANSFORMER.transform(rawExpression, context, parameters);

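A usage sketch for the SpEL-backed expression whose toDocument(...) is shown above; the expression string and field names are illustrative:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.AggregationSpELExpression;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

// The SpEL string is transformed into a native expression, roughly { $add : [ "$net", "$surcharge" ] }.
ProjectionOperation projection = project()
		.and(AggregationSpELExpression.expressionOf("net + surcharge")).as("total");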
View File

@@ -242,26 +242,48 @@ public class AggregationUpdate extends Aggregation implements UpdateDefinition {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#isIsolated()
*/
@Override
public Boolean isIsolated() {
return isolated;
}
/*
* Returns an update document containing the update pipeline.
* The resulting document needs to be unwrapped to be used with update operations.
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getUpdateObject()
*/
@Override
public Document getUpdateObject() {
return new Document("", toPipeline(Aggregation.DEFAULT_CONTEXT));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#modifies(java.lang.String)
*/
@Override
public boolean modifies(String key) {
return keysTouched.contains(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#inc(java.lang.String)
*/
@Override
public void inc(String key) {
set(new SetOperation(key, ArithmeticOperators.valueOf(key).add(1)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.UpdateDefinition#getArrayFilters()
*/
@Override
public List<ArrayFilter> getArrayFilters() {
return Collections.emptyList();

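A sketch of how the UpdateDefinition methods above are typically driven; the field and collection names are illustrative:

import org.springframework.data.mongodb.core.aggregation.AggregationUpdate;
import org.springframework.data.mongodb.core.query.Query;

AggregationUpdate update = AggregationUpdate.update();

// Adds a $set stage equivalent to { $add : [ "$visits", 1 ] }, as implemented in inc(String) above.
update.inc("visits");

// getUpdateObject() wraps the pipeline in a single-entry Document that the template unwraps on execution.
mongoTemplate.updateFirst(new Query(), update, "counters");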
View File

@@ -800,26 +800,6 @@ public class ArithmeticOperators {
return usesFieldRef() ? Cosh.coshOf(fieldReference, unit) : Cosh.coshOf(expression, unit);
}
/**
* Creates new {@link AggregationExpression} that calculates the inverse cosine of a numeric value.
*
* @return new instance of {@link ACos}.
* @since 3.4
*/
public ACos acos() {
return usesFieldRef() ? ACos.acosOf(fieldReference) : ACos.acosOf(expression);
}
/**
* Creates new {@link AggregationExpression} that calculates the inverse hyperbolic cosine of a numeric value.
*
* @return new instance of {@link ACosh}.
* @since 3.4
*/
public ACosh acosh() {
return usesFieldRef() ? ACosh.acoshOf(fieldReference) : ACosh.acoshOf(expression);
}
/**
* Creates new {@link AggregationExpression} that calculates the tangent of a numeric value given in
* {@link AngularUnit#RADIANS radians}.
@@ -2482,6 +2462,7 @@ public class ArithmeticOperators {
}
}
/**
* An {@link AggregationExpression expression} that calculates the cosine of a value that is measured in radians.
*
@@ -2684,108 +2665,6 @@ public class ArithmeticOperators {
}
}
/**
* An {@link AggregationExpression expression} that calculates the inverse cosine of a value.
*
* @author Divya Srivastava
* @since 3.4
*/
public static class ACos extends AbstractAggregationExpression {
private ACos(Object value) {
super(value);
}
/**
* Creates a new {@link AggregationExpression} that calculates the inverse cosine of a value.
*
* @param fieldReference the name of the {@link Field field} that resolves to a numeric value.
* @return new instance of {@link ACos}.
*/
public static ACos acosOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new ACos(Fields.field(fieldReference));
}
/**
* Creates a new {@link AggregationExpression} that calculates the inverse cosine of a value.
* <br />
*
* @param expression the {@link AggregationExpression expression} that resolves to a numeric value.
* @return new instance of {@link ACos}.
*/
public static ACos acosOf(AggregationExpression expression) {
return new ACos(expression);
}
/**
* Creates a new {@link AggregationExpression} that calculates the inverse cosine of a value.
*
* @param value anything ({@link Field field}, {@link AggregationExpression expression}, ...) that resolves to a
* numeric value.
* @return new instance of {@link ACos}.
*/
public static ACos acosOf(Number value) {
return new ACos(value);
}
@Override
protected String getMongoMethod() {
return "$acos";
}
}
/**
* An {@link AggregationExpression expression} that calculates the inverse hyperbolic cosine of a value.
*
* @author Divya Srivastava
* @since 3.4
*/
public static class ACosh extends AbstractAggregationExpression {
private ACosh(Object value) {
super(value);
}
/**
* Creates a new {@link AggregationExpression} that calculates the inverse hyperbolic cosine of a value.
*
* @param fieldReference the name of the {@link Field field} that resolves to a numeric value.
* @return new instance of {@link ACosh}.
*/
public static ACosh acoshOf(String fieldReference) {
return new ACosh(Fields.field(fieldReference));
}
/**
* Creates a new {@link AggregationExpression} that calculates the inverse hyperbolic cosine of a value.
* <br />
*
* @param expression the {@link AggregationExpression expression} that resolves to a numeric value.
* @return new instance of {@link ACosh}.
*/
public static ACosh acoshOf(AggregationExpression expression) {
return new ACosh(expression);
}
/**
* Creates a new {@link AggregationExpression} that calculates the inverse hyperbolic cosine of a value.
*
* @param value anything ({@link Field field}, {@link AggregationExpression expression}, ...) that resolves to a
* numeric value.
* @return new instance of {@link ACosh}.
*/
public static ACosh acoshOf(Object value) {
return new ACosh(value);
}
@Override
protected String getMongoMethod() {
return "$acosh";
}
}
/**
* An {@link AggregationExpression expression} that calculates the tangent of a value that is measured in radians.
*
@@ -2975,6 +2854,7 @@ public class ArithmeticOperators {
return new ATan2((Collections.singletonList(expression)));
}
/**
* Creates a new {@link AggregationExpression} that calculates the inverse tangent of y / x, where y and x are
* the first and second values passed to the expression respectively.

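The hunks above remove the acos()/acosh() shortcuts and the ACos/ACosh expressions from this branch; on branches that still ship them (3.4+), usage looks roughly like this (field names illustrative):

import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators;

// Render { $acos : "$angle" } and { $acosh : "$value" } respectively.
ArithmeticOperators.valueOf("angle").acos();
ArithmeticOperators.valueOf("value").acosh();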
View File

@@ -35,7 +35,6 @@ import org.springframework.util.Assert;
* @author Christoph Strobl
* @author Mark Paluch
* @author Shashank Sharma
* @author Divya Srivastava
* @since 1.0
*/
public class ArrayOperators {
@@ -363,38 +362,6 @@ public class ArrayOperators {
return usesExpression() ? ArrayToObject.arrayValueOfToObject(expression) : ArrayToObject.arrayToObject(values);
}
/**
* Creates new {@link AggregationExpression} that returns the first element in the associated array.
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
*
* @return new instance of {@link First}.
* @since 3.4
*/
public First first() {
if (usesFieldRef()) {
return First.firstOf(fieldReference);
}
return usesExpression() ? First.firstOf(expression) : First.first(values);
}
/**
* Creates new {@link AggregationExpression} that returns the last element in the given array.
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
*
* @return new instance of {@link Last}.
* @since 3.4
*/
public Last last() {
if (usesFieldRef()) {
return Last.lastOf(fieldReference);
}
return usesExpression() ? Last.lastOf(expression) : Last.last(values);
}
/**
* @author Christoph Strobl
@@ -645,6 +612,10 @@ public class ArrayOperators {
return new FilterExpressionBuilder().filter(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(final AggregationOperationContext context) {
return toFilter(ExposedFields.from(as), context);
@@ -765,6 +736,10 @@ public class ArrayOperators {
return new FilterExpressionBuilder();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.InputBuilder#filter(java.util.List)
*/
@Override
public AsBuilder filter(List<?> array) {
@@ -773,6 +748,10 @@ public class ArrayOperators {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.InputBuilder#filter(org.springframework.data.mongodb.core.aggregation.Field)
*/
@Override
public AsBuilder filter(Field field) {
@@ -781,6 +760,10 @@ public class ArrayOperators {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.AsBuilder#as(java.lang.String)
*/
@Override
public ConditionBuilder as(String variableName) {
@@ -789,6 +772,10 @@ public class ArrayOperators {
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.ConditionBuilder#by(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public Filter by(AggregationExpression condition) {
@@ -797,6 +784,10 @@ public class ArrayOperators {
return filter;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.ConditionBuilder#by(java.lang.String)
*/
@Override
public Filter by(String expression) {
@@ -805,6 +796,10 @@ public class ArrayOperators {
return filter;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.ConditionBuilder#by(org.bson.Document)
*/
@Override
public Filter by(Document expression) {
@@ -1249,6 +1244,9 @@ public class ArrayOperators {
this.reduceExpressions = reduceExpressions;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -1435,6 +1433,9 @@ public class ArrayOperators {
};
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return new Document(propertyName, aggregationExpression.toDocument(context));
@@ -1802,117 +1803,13 @@ public class ArrayOperators {
return new ArrayToObject(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AbstractAggregationExpression#getMongoMethod()
*/
@Override
protected String getMongoMethod() {
return "$arrayToObject";
}
}
/**
* {@link AggregationExpression} for {@code $first} that returns the first element in an array. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
*
* @author Divya Srivastava
* @author Christoph Strobl
* @since 3.4
*/
public static class First extends AbstractAggregationExpression {
private First(Object value) {
super(value);
}
/**
* Returns the first element in the given array.
*
* @param array must not be {@literal null}.
* @return new instance of {@link First}.
*/
public static First first(Object array) {
return new First(array);
}
/**
* Returns the first element in the array pointed to by the given {@link Field field reference}.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link First}.
*/
public static First firstOf(String fieldReference) {
return new First(Fields.field(fieldReference));
}
/**
* Returns the first element of the array computed by the given {@link AggregationExpression expression}.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link First}.
*/
public static First firstOf(AggregationExpression expression) {
return new First(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AbstractAggregationExpression#getMongoMethod()
*/
@Override
protected String getMongoMethod() {
return "$first";
}
}
/**
* {@link AggregationExpression} for {@code $last} that returns the last element in an array. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.4 or later.
*
* @author Divya Srivastava
* @author Christoph Strobl
* @since 3.4
*/
public static class Last extends AbstractAggregationExpression {
private Last(Object value) {
super(value);
}
/**
* Returns the last element in the given array.
*
* @param array must not be {@literal null}.
* @return new instance of {@link Last}.
*/
public static Last last(Object array) {
return new Last(array);
}
/**
* Returns the last element in the array pointed to by the given {@link Field field reference}.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Last}.
*/
public static Last lastOf(String fieldReference) {
return new Last(Fields.field(fieldReference));
}
/**
* Returns the last element of the array computed by the given {@link AggregationExpression expression}.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Last}.
*/
public static Last lastOf(AggregationExpression expression) {
return new Last(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AbstractAggregationExpression#getMongoMethod()
*/
@Override
protected String getMongoMethod() {
return "$last";
}
}
}

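Likewise, the removed first()/last() shortcuts and the First/Last expressions remain usable on branches that include them; a sketch with an illustrative array field (MongoDB 4.4+):

import org.springframework.data.mongodb.core.aggregation.ArrayOperators;

// Render { $first : "$items" } and { $last : "$items" }.
ArrayOperators.arrayOf("items").first();
ArrayOperators.arrayOf("items").last();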
View File

@@ -88,6 +88,9 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
this.granularity = granularity;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -104,6 +107,10 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
return new Document(getOperator(), options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@Override
public String getOperator() {
return "$bucketAuto";
@@ -137,21 +144,33 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
return new BucketAutoOperation(this, buckets, granularity.getMongoRepresentation());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#newBucketOperation(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Outputs)
*/
@Override
protected BucketAutoOperation newBucketOperation(Outputs outputs) {
return new BucketAutoOperation(this, outputs);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutputExpression(java.lang.String, java.lang.Object[])
*/
@Override
public ExpressionBucketAutoOperationBuilder andOutputExpression(String expression, Object... params) {
return new ExpressionBucketAutoOperationBuilder(expression, this, params);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public BucketAutoOperationOutputBuilder andOutput(AggregationExpression expression) {
return new BucketAutoOperationOutputBuilder(expression, this);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(java.lang.String)
*/
@Override
public BucketAutoOperationOutputBuilder andOutput(String fieldName) {
return new BucketAutoOperationOutputBuilder(Fields.field(fieldName), this);
@@ -173,6 +192,9 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
super(value, operation);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketAutoOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketAutoOperationOutputBuilder(operationOutput, this.operation);
@@ -201,6 +223,9 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
super(expression, operation, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketAutoOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketAutoOperationOutputBuilder(operationOutput, this.operation);
@@ -245,6 +270,9 @@ public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperat
this.granularity = granularity;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Granularity#toMongoGranularity()
*/
@Override
public String getMongoRepresentation() {
return granularity;

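A minimal sketch of the $bucketAuto rendering produced by toDocument(...) above; the group-by field and bucket count are illustrative:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.bucketAuto;

import org.springframework.data.mongodb.core.aggregation.BucketAutoOperation;

// Renders { $bucketAuto : { groupBy : "$price", buckets : 4 } }.
BucketAutoOperation buckets = bucketAuto("price", 4);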
View File

@@ -84,6 +84,9 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
this.defaultBucket = defaultBucket;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -100,6 +103,10 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
return new Document(getOperator(), options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@Override
public String getOperator() {
return "$bucket";
@@ -136,21 +143,33 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
return new BucketOperation(this, newBoundaries, defaultBucket);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#newBucketOperation(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Outputs)
*/
@Override
protected BucketOperation newBucketOperation(Outputs outputs) {
return new BucketOperation(this, outputs);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutputExpression(java.lang.String, java.lang.Object[])
*/
@Override
public ExpressionBucketOperationBuilder andOutputExpression(String expression, Object... params) {
return new ExpressionBucketOperationBuilder(expression, this, params);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public BucketOperationOutputBuilder andOutput(AggregationExpression expression) {
return new BucketOperationOutputBuilder(expression, this);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(java.lang.String)
*/
@Override
public BucketOperationOutputBuilder andOutput(String fieldName) {
return new BucketOperationOutputBuilder(Fields.field(fieldName), this);
@@ -172,6 +191,9 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
super(value, operation);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketOperationOutputBuilder(operationOutput, this.operation);
@@ -199,6 +221,9 @@ public class BucketOperation extends BucketOperationSupport<BucketOperation, Buc
super(expression, operation, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketOperationOutputBuilder(operationOutput, this.operation);

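The corresponding $bucket sketch; the boundaries and default bucket label are illustrative:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.bucket;

import org.springframework.data.mongodb.core.aggregation.BucketOperation;

// Renders roughly { $bucket : { groupBy : "$price", boundaries : [ 0, 100, 400 ], default : "Other" } }.
BucketOperation byPrice = bucket("price")
		.withBoundaries(0, 100, 400)
		.withDefaultBucket("Other");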
View File

@@ -141,6 +141,9 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
});
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -156,6 +159,9 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
return document;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return outputs.asExposedFields();
@@ -448,6 +454,9 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
return outputs.isEmpty();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -531,6 +540,10 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
this.values = operationOutput.values;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -623,6 +636,9 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
this.params = parameters.clone();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return (Document) TRANSFORMER.transform(expression, context, params);
@@ -649,6 +665,9 @@ public abstract class BucketOperationSupport<T extends BucketOperationSupport<T,
this.expression = expression;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return expression.toDocument(context);

View File

@@ -275,6 +275,10 @@ public class ConditionalOperators {
return new IfNullOperatorBuilder().ifNull(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -404,6 +408,9 @@ public class ConditionalOperators {
return new IfNullOperatorBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.IfNullBuilder#ifNull(java.lang.String)
*/
public ThenBuilder ifNull(String fieldReference) {
Assert.hasText(fieldReference, "FieldReference name must not be null or empty!");
@@ -411,6 +418,9 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.IfNullBuilder#ifNull(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public ThenBuilder ifNull(AggregationExpression expression) {
@@ -429,16 +439,25 @@ public class ConditionalOperators {
return ifNull(expression);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#then(java.lang.Object)
*/
public IfNull then(Object value) {
return new IfNull(conditions, value);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#thenValueOf(java.lang.String)
*/
public IfNull thenValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IfNull(conditions, Fields.field(fieldReference));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#thenValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
public IfNull thenValueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
@@ -524,6 +543,9 @@ public class ConditionalOperators {
};
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -611,6 +633,10 @@ public class ConditionalOperators {
this.otherwiseValue = otherwiseValue;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -649,10 +675,7 @@ public class ConditionalOperators {
if (value instanceof CriteriaDefinition) {
Document mappedObject = context.getMappedObject(((CriteriaDefinition) value).getCriteriaObject());
List<Object> clauses = new ArrayList<Object>();
clauses.addAll(getClauses(context, mappedObject));
List<Object> clauses = getClauses(context, mappedObject);
return clauses.size() == 1 ? clauses.get(0) : clauses;
}
@@ -679,7 +702,9 @@ public class ConditionalOperators {
if (predicate instanceof List) {
List<Object> args = new ArrayList<Object>();
List<?> predicates = (List<?>) predicate;
List<Object> args = new ArrayList<Object>(predicates.size());
for (Object clause : (List<?>) predicate) {
if (clause instanceof Document) {
args.addAll(getClauses(context, (Document) clause));
@@ -697,14 +722,14 @@ public class ConditionalOperators {
continue;
}
List<Object> args = new ArrayList<Object>();
List<Object> args = new ArrayList<Object>(2);
args.add("$" + key);
args.add(nested.get(s));
clauses.add(new Document(s, args));
}
} else if (!isKeyword(key)) {
List<Object> args = new ArrayList<Object>();
List<Object> args = new ArrayList<Object>(2);
args.add("$" + key);
args.add(predicate);
clauses.add(new Document("$eq", args));
@@ -889,6 +914,9 @@ public class ConditionalOperators {
return new ConditionalExpressionBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.bson.Document)
*/
@Override
public ConditionalExpressionBuilder when(Document booleanExpression) {
@@ -898,6 +926,9 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public ThenBuilder when(CriteriaDefinition criteria) {
@@ -906,6 +937,9 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public ThenBuilder when(AggregationExpression expression) {
@@ -914,6 +948,9 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(java.lang.String)
*/
@Override
public ThenBuilder when(String booleanField) {
@@ -922,6 +959,9 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#then(java.lang.Object)
*/
@Override
public OtherwiseBuilder then(Object thenValue) {
@@ -930,6 +970,9 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#thenValueOf(java.lang.String)
*/
@Override
public OtherwiseBuilder thenValueOf(String fieldReference) {
@@ -938,6 +981,9 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#thenValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public OtherwiseBuilder thenValueOf(AggregationExpression expression) {
@@ -946,6 +992,9 @@ public class ConditionalOperators {
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwise(java.lang.Object)
*/
@Override
public Cond otherwise(Object otherwiseValue) {
@@ -953,6 +1002,9 @@ public class ConditionalOperators {
return new Cond(condition, thenValue, otherwiseValue);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwiseValueOf(java.lang.String)
*/
@Override
public Cond otherwiseValueOf(String fieldReference) {
@@ -960,6 +1012,9 @@ public class ConditionalOperators {
return new Cond(condition, thenValue, Fields.field(fieldReference));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwiseValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public Cond otherwiseValueOf(AggregationExpression expression) {

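A short sketch of the builders touched above; the criteria and fallback values are illustrative:

import org.springframework.data.mongodb.core.aggregation.ConditionalOperators;
import org.springframework.data.mongodb.core.query.Criteria;

// $ifNull: use a fallback when the field is missing or null.
ConditionalOperators.ifNull("discount").then(0);

// $cond: the CriteriaDefinition is mapped and unwrapped by getClauses(...) as shown above.
ConditionalOperators.when(Criteria.where("qty").gte(100)).then("bulk").otherwise("single");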
View File

@@ -43,6 +43,9 @@ public class CountOperation implements FieldsExposingAggregationOperation {
this.fieldName = fieldName;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
return new Document(getOperator(), fieldName);
@@ -53,6 +56,9 @@ public class CountOperation implements FieldsExposingAggregationOperation {
return "$count";
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return ExposedFields.from(new ExposedField(fieldName, true));

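The $count stage built above, as a one-line sketch; the output field name is illustrative:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.count;

import org.springframework.data.mongodb.core.aggregation.CountOperation;

// Renders { $count : "total" }; "total" is also exposed through getFields() above.
CountOperation total = count().as("total");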
View File

@@ -91,7 +91,8 @@ public class DateOperators {
}
/**
* Take the given value as date. <br />
* Take the given value as date.
* <br />
* This can be one of:
* <ul>
* <li>{@link java.util.Date}</li>
@@ -189,7 +190,7 @@ public class DateOperators {
* representing an Olson Timezone Identifier or UTC Offset.
*
* @param value the plain timezone {@link String}, a {@link Field} holding the timezone or an
* {@link AggregationExpression} resulting in the timezone.
* {@link AggregationExpression} resulting in the timezone.
* @return new instance of {@link Timezone}.
*/
public static Timezone valueOf(Object value) {
@@ -332,7 +333,8 @@ public class DateOperators {
}
/**
* Creates new {@link DateOperatorFactory} for given {@code value} that resolves to a Date. <br />
* Creates new {@link DateOperatorFactory} for given {@code value} that resolves to a Date.
* <br />
* <ul>
* <li>{@link java.util.Date}</li>
* <li>{@link java.util.Calendar}</li>
@@ -2086,6 +2088,20 @@ public class DateOperators {
return second(expression);
}
/**
* Set the {@literal millisecond} to the given value which must resolve to a value in range {@code 0 - 999}. Can be
* a simple value, {@link Field field reference} or {@link AggregationExpression expression}.
*
* @param millisecond must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal millisecond} is {@literal null}
* @deprecated since 3.2, use {@link #millisecond(Object)} instead.
*/
@Deprecated
default T milliseconds(Object millisecond) {
return millisecond(millisecond);
}
/**
* Set the {@literal millisecond} to the given value which must resolve to a value in range {@code 0 - 999}. Can be
* a simple value, {@link Field field reference} or {@link AggregationExpression expression}.
@@ -2097,6 +2113,19 @@ public class DateOperators {
*/
T millisecond(Object millisecond);
/**
* Set the {@literal millisecond} to the value resolved by following the given {@link Field field reference}.
*
* @param fieldReference must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal fieldReference} is {@literal null}.
* @deprecated since 3.2, use {@link #millisecondOf(String)} instead.
*/
@Deprecated
default T millisecondsOf(String fieldReference) {
return millisecondOf(fieldReference);
}
/**
* Set the {@literal millisecond} to the value resolved by following the given {@link Field field reference}.
*
@@ -2106,7 +2135,20 @@ public class DateOperators {
* @since 3.2
*/
default T millisecondOf(String fieldReference) {
return millisecond(Fields.field(fieldReference));
return milliseconds(Fields.field(fieldReference));
}
/**
* Set the {@literal millisecond} to the result of the given {@link AggregationExpression expression}.
*
* @param expression must not be {@literal null}.
* @return new instance.
* @throws IllegalArgumentException if given {@literal expression} is {@literal null}.
* @deprecated since 3.2, use {@link #millisecondOf(AggregationExpression)} instead.
*/
@Deprecated
default T millisecondsOf(AggregationExpression expression) {
return millisecondOf(expression);
}
/**
@@ -2118,7 +2160,7 @@ public class DateOperators {
* @since 3.2
*/
default T millisecondOf(AggregationExpression expression) {
return millisecond(expression);
return milliseconds(expression);
}
}
@@ -2129,7 +2171,7 @@ public class DateOperators {
* @author Matt Morrissette
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromParts/">https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromParts/</a>
* "https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromParts/">https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromParts/</a>
* @since 2.1
*/
public static class DateFromParts extends TimezonedDateAggregationExpression implements DateParts<DateFromParts> {
@@ -2304,7 +2346,7 @@ public class DateOperators {
* @author Matt Morrissette
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromParts/">https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromParts/</a>
* "https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromParts/">https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromParts/</a>
* @since 2.1
*/
public static class IsoDateFromParts extends TimezonedDateAggregationExpression
@@ -2480,7 +2522,7 @@ public class DateOperators {
* @author Matt Morrissette
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/dateToParts/">https://docs.mongodb.com/manual/reference/operator/aggregation/dateToParts/</a>
* "https://docs.mongodb.com/manual/reference/operator/aggregation/dateToParts/">https://docs.mongodb.com/manual/reference/operator/aggregation/dateToParts/</a>
* @since 2.1
*/
public static class DateToParts extends TimezonedDateAggregationExpression {
@@ -2561,7 +2603,7 @@ public class DateOperators {
* @author Matt Morrissette
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromString/">https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromString/</a>
* "https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromString/">https://docs.mongodb.com/manual/reference/operator/aggregation/dateFromString/</a>
* @since 2.1
*/
public static class DateFromString extends TimezonedDateAggregationExpression {

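The deprecated milliseconds(...)/millisecondsOf(...) defaults above simply delegate to their singular millisecond counterparts. Separately, a small sketch of the Timezone handling documented above; the field name and timezone are illustrative:

import org.springframework.data.mongodb.core.aggregation.DateOperators;
import org.springframework.data.mongodb.core.aggregation.DateOperators.Timezone;

// Renders { $hour : { date : "$purchaseDate", timezone : "America/Chicago" } }.
DateOperators.dateOf("purchaseDate")
		.withTimezone(Timezone.valueOf("America/Chicago"))
		.hour();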
View File

@@ -46,6 +46,10 @@ abstract class DocumentEnhancingOperation implements InheritsFieldsAggregationOp
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
@@ -67,6 +71,10 @@ abstract class DocumentEnhancingOperation implements InheritsFieldsAggregationOp
*/
protected abstract String mongoOperator();
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@Override
public String getOperator() {
return mongoOperator();
@@ -79,6 +87,10 @@ abstract class DocumentEnhancingOperation implements InheritsFieldsAggregationOp
return this.valueMap;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return exposedFields;

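DocumentEnhancingOperation is the shared base for field-adding stages such as $addFields; a sketch using the AddFieldsOperation builder (assumed available on this branch; the field name and value are illustrative):

import static org.springframework.data.mongodb.core.aggregation.Aggregation.addFields;

import org.springframework.data.mongodb.core.aggregation.AddFieldsOperation;

// Renders { $addFields : { discounted : true } } through the toDocument(...) shown above.
AddFieldsOperation enhance = addFields().addFieldWithValue("discounted", true).build();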
Some files were not shown because too many files have changed in this diff.